ERROR 1064 (42000): database error with OVER (PARTITION BY ...) syntax

2021-09-25 00:00:00 sql window-functions mysql mysql-5.7
mysql> select * from FinalTable;
+------+-------+-------+---------------------+
| id   | name  | state | timestamp           |
+------+-------+-------+---------------------+
|   12 | name1 | TX    | 2020-01-25 11:29:36 |
|   14 | name3 | CA    | 2020-01-25 11:29:36 |
|   14 | name3 | TX    | 2020-01-25 11:29:36 |
|   12 | name1 | CA    | 2020-01-25 11:29:36 |
|   13 | name2 | TA    | 2020-01-25 11:29:36 |
|   14 | name3 | CA    | 2020-01-25 11:29:36 |
+------+-------+-------+---------------------+

I am looking for a query whose output is:

12 name1 TX 2020-01-25 11:29:36  CA 2020-01-25 11:29:36

When I run the query,

select id,name,state,timestamp,
lead(state,1) over (partition by id order by timestamp asc) out_state,
lead(timestamp,1) over (partition by id order by timestamp asc) out_timestamp
from FinalTable

ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '(partition by id order by timestamp asc) out_state,
lead(timestamp,1) over (part' at line 2

Also, is it possible to create a timestamp with millisecond precision instead of whole seconds in the DB? I am using CURRENT_TIMESTAMP.

Accepted answer

Window functions (such as lead()) were only added in MySQL 8.0, so they are not available in version 5.7. You can emulate lead() with a self-join like so:

select t.*, tlead.state as out_state, tlead.timestamp as out_timestamp
from FinalTable t
left join FinalTable tlead 
    on tlead.id = t.id
    and tlead.timestamp = (
        select min(t1.timestamp) 
        from FinalTable t1 
        where t1.id = t.id and t1.timestamp > t.timestamp
    )
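To see the self-join pattern in action without a MySQL 5.7 server, here is a small sketch using Python's built-in sqlite3 as a stand-in engine (the join itself is plain SQL and runs the same way on MySQL). The sample rows are hypothetical: the timestamps have been made distinct, per the side note below, so the emulation is deterministic.

```python
# Demonstration of emulating lead() with a self-join.
# SQLite is used only as a stand-in engine; the SQL is engine-agnostic.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table FinalTable (id int, name text, state text, timestamp text)")

# Hypothetical sample data with distinct timestamps per id
rows = [
    (12, "name1", "TX", "2020-01-25 11:29:36"),
    (12, "name1", "CA", "2020-01-25 11:29:37"),
    (13, "name2", "TA", "2020-01-25 11:29:36"),
    (14, "name3", "CA", "2020-01-25 11:29:36"),
    (14, "name3", "TX", "2020-01-25 11:29:37"),
]
conn.executemany("insert into FinalTable values (?, ?, ?, ?)", rows)

# Same self-join as above: for each row, find the row of the same id
# with the smallest timestamp strictly greater than the current one.
query = """
select t.id, t.name, t.state, t.timestamp, tlead.state, tlead.timestamp
from FinalTable t
left join FinalTable tlead
    on tlead.id = t.id
    and tlead.timestamp = (
        select min(t1.timestamp)
        from FinalTable t1
        where t1.id = t.id and t1.timestamp > t.timestamp
    )
order by t.id, t.timestamp
"""
for row in conn.execute(query):
    print(row)
```

The last row of each id has no "next" row, so the left join leaves the lead columns NULL, matching what lead() would return.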

Side note: for this method to work properly, subsequent records of the same id need to have different timestamps, which is not the case in the sample data you showed, where all timestamps are identical (I assume this is a typo in your sample data).
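As for the follow-up question about sub-second precision: MySQL 5.6.4 and later (including 5.7) support fractional seconds on DATETIME and TIMESTAMP columns, up to microseconds. A sketch, with a hypothetical table and column name:

```sql
-- Hypothetical table: DATETIME(3) stores millisecond precision.
-- The default must use the matching precision, CURRENT_TIMESTAMP(3).
create table EventLog (
    id int,
    created_at datetime(3) not null default current_timestamp(3)
);
```

Note that plain CURRENT_TIMESTAMP (precision 0) cannot be used as the default for a fractional column; the precision arguments must match.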
