How do I GROUP BY day in MySQL to sum a field?

2017-08-04 17:38:33 +08:00
 solaro

mysql> select *, from_unixtime(log_time) from user_traffic_log where user_id = 43 order by id desc limit 30;
+------+---------+--------+----------+---------+------+-----------+------------+---------------------+---------------------+-------------------------+
| id   | user_id | u      | d        | node_id | rate | traffic   | log_time   | created_at          | updated_at          | from_unixtime(log_time) |
+------+---------+--------+----------+---------+------+-----------+------------+---------------------+---------------------+-------------------------+
| 1229 |      43 |   6809 |    97141 |       3 |    1 | 101.51KB  | 1501837835 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 17:10:35     |
| 1225 |      43 |   5592 |   641620 |       1 |    1 | 632.04KB  | 1501837436 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 17:03:56     |
| 1205 |      43 |  70283 |   957250 |       2 |    1 | 1003.45KB | 1501836235 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:43:55     |
| 1200 |      43 |  56222 |  2845833 |       1 |    1 | 2.77MB    | 1501835995 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:39:55     |
| 1194 |      43 |  74216 |  2099903 |       3 |    1 | 2.07MB    | 1501835370 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:29:30     |
| 1191 |      43 |  19158 |  5012670 |       1 |    1 | 4.8MB     | 1501835274 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:27:54     |
| 1189 |      43 |  50972 |  5185555 |       1 |    1 | 4.99MB    | 1501835214 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:26:54     |
| 1185 |      43 |  57346 |  5559680 |       2 |    1 | 5.36MB    | 1501835153 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:25:53     |
| 1183 |      43 | 176168 |   536972 |       1 |    1 | 696.43KB  | 1501835094 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:24:54     |
| 1182 |      43 |  10261 |  5291157 |       3 |    1 | 5.06MB    | 1501835070 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:24:30     |
| 1181 |      43 |   8071 |  1696844 |       3 |    1 | 1664.96KB | 1501835009 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:23:29     |
| 1171 |      43 |   4241 |  3060157 |       2 |    1 | 2.92MB    | 1501834432 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:13:52     |
| 1162 |      43 |   3968 |  4280221 |       3 |    1 | 4.09MB    | 1501833927 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:05:27     |
| 1160 |      43 |  16919 |  6623537 |       3 |    1 | 6.33MB    | 1501833867 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:04:27     |
| 1158 |      43 |  20170 |  7442167 |       2 |    1 | 7.12MB    | 1501833831 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:03:51     |
| 1157 |      43 | 226559 |  1899232 |       3 |    1 | 2.03MB    | 1501833807 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:03:27     |
| 1154 |      43 |      0 | 14568929 |       2 |    1 | 13.89MB   | 1501833771 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:02:51     |
| 1152 |      43 |      0 | 22722588 |       2 |    1 | 21.67MB   | 1501833711 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:01:51     |
| 1151 |      43 |  27216 | 20017926 |       2 |    1 | 19.12MB   | 1501833651 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 16:00:51     |
| 1150 |      43 | 162879 |  4914082 |       1 |    1 | 4.84MB    | 1501833594 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:59:54     |
| 1149 |      43 |  15674 | 21161488 |       2 |    1 | 20.2MB    | 1501833591 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:59:51     |
| 1148 |      43 |  18395 |  2207526 |       2 |    1 | 2.12MB    | 1501833531 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:58:51     |
| 1144 |      43 |      0 |  2939218 |       2 |    1 | 2.8MB     | 1501833470 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:57:50     |
| 1140 |      43 |  28098 |  5464999 |       2 |    1 | 5.24MB    | 1501833410 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:56:50     |
| 1138 |      43 |  69068 |  4552818 |       2 |    1 | 4.41MB    | 1501833350 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:55:50     |
| 1137 |      43 |  27594 | 10257535 |       3 |    1 | 9.81MB    | 1501833326 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:55:26     |
| 1134 |      43 |  54264 | 47546684 |       3 |    1 | 45.4MB    | 1501833266 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:54:26     |
| 1125 |      43 |      0 | 23892102 |       2 |    1 | 22.79MB   | 1501832389 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:39:49     |
| 1122 |      43 |  81347 |  1652575 |       1 |    1 | 1693.28KB | 1501832333 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:38:53     |
| 1121 |      43 |  46683 | 19336063 |       2 |    1 | 18.48MB   | 1501832329 | 0000-00-00 00:00:00 | 0000-00-00 00:00:00 | 2017-08-04 15:38:49     |
+------+---------+--------+----------+---------+------+-----------+------------+---------------------+---------------------+-------------------------+
30 rows in set (0.00 sec)

There is a huge amount of data every day, and I'm stuck. My requirement: for the last 30 days, compute a sum for each day. Surely I don't have to use IF in the SQL...

3302 views
Node:    Q&A
11 replies
solaro
2017-08-04 17:39:24 +08:00
liprais
2017-08-04 17:40:45 +08:00
1. MySQL's FROM_UNIXTIME() can convert a Unix timestamp into a date.
2. You can compute each day separately and then add up the 30 days.
solaro
2017-08-04 17:42:48 +08:00
@liprais I put it badly. I don't want one sum over 30 days: every day has a big pile of records, and I want the sum for each day, then only keep the last 30 days of those.
liprais
2017-08-04 17:46:48 +08:00
@solaro Just GROUP BY the day.
solaro
2017-08-04 17:48:38 +08:00
@liprais How do I write that? No idea...
liprais
2017-08-04 17:52:11 +08:00
select date(from_unixtime(log_time)), sum(whatever_you_want_to_sum) from your_table group by date(from_unixtime(log_time));
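The one-liner above can be sketched end-to-end. This is a minimal runnable illustration using SQLite, where date(log_time, 'unixepoch') stands in for MySQL's date(from_unixtime(log_time)); the sample rows, the fixed '2017-08-05' cutoff, and the reduced column set are assumptions for the demo, not the thread's real data:

```python
import sqlite3

# In-memory stand-in for the user_traffic_log table from the thread.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user_traffic_log (
        id       INTEGER PRIMARY KEY,
        user_id  INTEGER,
        u        INTEGER,   -- upload bytes
        d        INTEGER,   -- download bytes
        log_time INTEGER    -- Unix timestamp, as in the thread
    )
""")
conn.executemany(
    "INSERT INTO user_traffic_log VALUES (?, ?, ?, ?, ?)",
    [
        (1, 43, 100, 1000, 1501837835),  # 2017-08-04 (UTC)
        (2, 43, 200, 2000, 1501837436),  # 2017-08-04 (UTC)
        (3, 43, 300, 3000, 1501751035),  # 2017-08-03 (UTC)
    ],
)

# Per-day sums for one user, restricted to the 30 days before a cutoff.
# The cutoff '2017-08-05' is hard-coded so the demo is deterministic;
# in production you would use 'now'.
query = """
    SELECT date(log_time, 'unixepoch') AS day,
           SUM(u + d)                  AS total_bytes
    FROM   user_traffic_log
    WHERE  user_id = 43
      AND  log_time >= CAST(strftime('%s', '2017-08-05', '-30 days') AS INTEGER)
    GROUP  BY day
    ORDER  BY day DESC
"""
results = list(conn.execute(query))
print(results)  # one (day, total) pair per day
```

In MySQL itself the last-30-days filter would be something like `log_time >= unix_timestamp(now() - interval 30 day)` added to liprais's query.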
johnny23
2017-08-04 17:52:13 +08:00
When you design the table you can add a redundant column that stores the date (year-month-day); then the GROUP BY becomes trivial. Or try a temporary table.
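johnny23's redundant-column idea can be sketched like this (SQLite again as a stand-in for MySQL; the column name log_date and the insert helper are hypothetical):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
# log_date is a hypothetical redundant column holding the day as text,
# filled at insert time so queries never convert timestamps per row.
conn.execute("""
    CREATE TABLE user_traffic_log (
        id       INTEGER PRIMARY KEY,
        user_id  INTEGER,
        d        INTEGER,
        log_time INTEGER,
        log_date TEXT       -- redundant 'YYYY-MM-DD'
    )
""")

def insert(row_id, user_id, d, log_time):
    # Derive the redundant column from the timestamp when the row is written.
    day = datetime.fromtimestamp(log_time, tz=timezone.utc).strftime("%Y-%m-%d")
    conn.execute("INSERT INTO user_traffic_log VALUES (?, ?, ?, ?, ?)",
                 (row_id, user_id, d, log_time, day))

insert(1, 43, 1000, 1501837835)  # 2017-08-04 (UTC)
insert(2, 43, 2000, 1501751035)  # 2017-08-03 (UTC)

# The GROUP BY is now trivial, and a plain index on log_date can serve it.
daily = list(conn.execute(
    "SELECT log_date, SUM(d) FROM user_traffic_log "
    "GROUP BY log_date ORDER BY log_date"))
print(daily)
```

The trade-off is classic denormalization: a little extra storage and insert-time work in exchange for cheap, index-friendly daily aggregation.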
solaro
2017-08-04 17:56:50 +08:00
@liprais
@johnny23
The problem is the table schema can't be changed at this point.
solaro
2017-08-04 17:57:29 +08:00
@johnny23 If a direct query can't do it, I'll have to write a script that runs every night on a schedule and aggregates the data.
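The nightly pre-aggregation fallback solaro mentions might look roughly like this. A sketch under assumptions: SQLite stands in for MySQL, and daily_traffic is a hypothetical summary table that a scheduled job rebuilds:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE user_traffic_log (user_id INTEGER, d INTEGER, log_time INTEGER);
    -- Hypothetical summary table the nightly job fills.
    CREATE TABLE daily_traffic (
        user_id INTEGER,
        day     TEXT,
        total   INTEGER,
        PRIMARY KEY (user_id, day)
    );
""")
conn.executemany(
    "INSERT INTO user_traffic_log VALUES (?, ?, ?)",
    [(43, 1000, 1501837835),   # 2017-08-04 (UTC)
     (43, 2000, 1501837436),   # 2017-08-04 (UTC)
     (43, 3000, 1501751035)],  # 2017-08-03 (UTC)
)

def nightly_rollup(conn):
    # Recompute per-user, per-day totals from the raw log.
    # INSERT OR REPLACE makes the job safe to re-run for the same day.
    conn.execute("""
        INSERT OR REPLACE INTO daily_traffic (user_id, day, total)
        SELECT user_id, date(log_time, 'unixepoch'), SUM(d)
        FROM   user_traffic_log
        GROUP  BY user_id, date(log_time, 'unixepoch')
    """)

nightly_rollup(conn)
summary = list(conn.execute("SELECT day, total FROM daily_traffic ORDER BY day"))
print(summary)
```

With liprais's GROUP BY working directly, this job isn't needed here, but the pattern is the usual escape hatch once the raw table grows too large to scan on every request.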
solaro
2017-08-04 17:59:45 +08:00
@liprais Love you, thanks a million, it works now!
johnny23
2017-08-04 18:02:10 +08:00
@liprais Nicely done.


https://www.v2ex.com/t/380477
