MySQL: using GROUP BY to take the maximum value of a field per group
Suppose a business scenario where we need to query users' login records, with the following table structure:
CREATE TABLE `tb` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`uid` int(11) NOT NULL,
`ip` varchar(16) NOT NULL,
`login_time` datetime,
PRIMARY KEY (`id`),
KEY (`uid`)
);
Now some test data:
INSERT INTO tb SELECT null, 1001, '192.168.1.1', '2017-01-21 16:30:47';
INSERT INTO tb SELECT null, 1003, '192.168.1.153', '2017-01-21 19:30:51';
INSERT INTO tb SELECT null, 1001, '192.168.1.61', '2017-01-21 16:50:41';
INSERT INTO tb SELECT null, 1002, '192.168.1.31', '2017-01-21 18:30:21';
INSERT INTO tb SELECT null, 1002, '192.168.1.66', '2017-01-21 19:12:32';
INSERT INTO tb SELECT null, 1001, '192.168.1.81', '2017-01-21 19:53:09';
INSERT INTO tb SELECT null, 1001, '192.168.1.231', '2017-01-21 19:55:34';
The table data now looks like this:
+----+------+---------------+---------------------+
| id | uid  | ip            | login_time          |
+----+------+---------------+---------------------+
| 1  | 1001 | 192.168.1.1   | 2017-01-21 16:30:47 |
| 2  | 1003 | 192.168.1.153 | 2017-01-21 19:30:51 |
| 3  | 1001 | 192.168.1.61  | 2017-01-21 16:50:41 |
| 4  | 1002 | 192.168.1.31  | 2017-01-21 18:30:21 |
| 5  | 1002 | 192.168.1.66  | 2017-01-21 19:12:32 |
| 6  | 1001 | 192.168.1.81  | 2017-01-21 19:53:09 |
| 7  | 1001 | 192.168.1.231 | 2017-01-21 19:55:34 |
+----+------+---------------+---------------------+
If we only need each user's last login time, the query is simple:
SELECT uid, max(login_time)
FROM tb
GROUP BY uid;
+------+---------------------+
| uid  | max(login_time)     |
+------+---------------------+
| 1001 | 2017-01-21 19:55:34 |
| 1002 | 2017-01-21 19:12:32 |
| 1003 | 2017-01-21 19:30:51 |
+------+---------------------+
If we also need other columns from the row of each user's last login, we cannot write the SQL this way:
-- Wrong
SELECT uid, ip, max(login_time)
FROM tb
GROUP BY uid;
This statement is not standard SQL. Although it executes successfully in MySQL, the ip it returns is indeterminate. (With only_full_group_by enabled in sql_mode, it fails with an error instead.) In practice the ip column tends to come from the first row read for each uid group, which is clearly not the information we need.
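To check which behavior you will get, inspect sql_mode (a quick sanity check; the exact error message varies by version):

SELECT @@sql_mode; -- look for ONLY_FULL_GROUP_BY

On MySQL 5.7 and later, ANY_VALUE() makes the statement legal under only_full_group_by, but the returned ip is still arbitrary, so it does not solve our problem:

SELECT uid, ANY_VALUE(ip) AS ip, max(login_time)
FROM tb
GROUP BY uid;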
Approach 1
Write a subquery. Note that matching on login_time alone would be unsafe (one user's latest login time could coincide with another user's non-latest one), so compare the (uid, login_time) pair:
SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE (a.uid, a.login_time) IN (
    -- one (uid, latest login_time) pair per user
    SELECT uid, max(login_time)
    FROM tb
    GROUP BY uid);
Approach 2
Or, alternatively, use a correlated subquery:
SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE a.login_time = (
    -- correlated subquery: this user's latest login time
    SELECT max(login_time)
    FROM tb
    WHERE uid = a.uid);
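One aside worth testing on your own data (an assumption, not something benchmarked here): a composite index on (uid, login_time) would let the correlated max() be resolved from the index alone rather than by reading rows:

-- hypothetical index; the table above only declares KEY (uid)
ALTER TABLE tb ADD KEY idx_uid_time (uid, login_time);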
Incidentally, I tested this along the way:
On versions before 5.6, the Approach 2 SQL produces a poor execution plan on large data sets, and performance looks correspondingly bad. On 5.6 and later, the same SQL runs much faster, and the execution plan changes.
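For reference, the plans below should be reproducible by prefixing Approach 2 with EXPLAIN (the original only shows the output):

EXPLAIN SELECT a.uid, a.ip, a.login_time
FROM tb a
WHERE a.login_time = (
    SELECT max(login_time)
    FROM tb
    WHERE uid = a.uid);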
5.5.50:
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
| id | select_type        | table | type | possible_keys | key  | key_len | ref  | rows | Extra       |
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
|  1 | PRIMARY            | a     | ALL  | NULL          | NULL | NULL    | NULL |    7 | Using where |
|  2 | DEPENDENT SUBQUERY | tb    | ALL  | uid           | NULL | NULL    | NULL |    7 | Using where |
+----+--------------------+-------+------+---------------+------+---------+------+------+-------------+
5.6.30:
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
| id | select_type        | table | type | possible_keys | key  | key_len | ref        | rows | Extra       |
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
|  1 | PRIMARY            | a     | ALL  | NULL          | NULL | NULL    | NULL       |    7 | Using where |
|  2 | DEPENDENT SUBQUERY | tb    | ref  | uid           | uid  | 4       | test.a.uid |    1 | NULL        |
+----+--------------------+-------+------+---------------+------+---------+------------+------+-------------+
Approach 3
Rewriting it as a JOIN against a derived table performs even better:
SELECT a.uid, a.ip, a.login_time
FROM (SELECT uid, max(login_time) AS login_time
      FROM tb
      GROUP BY uid) b
JOIN tb a ON a.uid = b.uid AND a.login_time = b.login_time;
Of course, all of these return the same result:
+------+---------------+---------------------+
| uid  | ip            | login_time          |
+------+---------------+---------------------+
| 1003 | 192.168.1.153 | 2017-01-21 19:30:51 |
| 1002 | 192.168.1.66  | 2017-01-21 19:12:32 |
| 1001 | 192.168.1.231 | 2017-01-21 19:55:34 |
+------+---------------+---------------------+
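As an aside beyond the original approaches: on MySQL 8.0 and later, a window function expresses the same group-wise maximum directly (a sketch, assuming 8.0's window-function support):

SELECT uid, ip, login_time
FROM (SELECT uid, ip, login_time,
             ROW_NUMBER() OVER (PARTITION BY uid ORDER BY login_time DESC) AS rn
      FROM tb) t
WHERE rn = 1; -- rn = 1 is each user's most recent login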
Note: to take the minimum per group instead, just swap the corresponding function and comparison, as in the example below.
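For example, Approach 3 rewritten for the earliest login per user:

SELECT a.uid, a.ip, a.login_time
FROM (SELECT uid, min(login_time) AS login_time -- min instead of max
      FROM tb
      GROUP BY uid) b
JOIN tb a ON a.uid = b.uid AND a.login_time = b.login_time;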
