I want the standard deviation of acceleration, partitioned by reg, date and hour. I expected the following code to return a single value per partition, but every row gets a different value:

SELECT *,
       STDDEV(ACCELERATION) OVER (PARTITION BY REG, DATE, HOUR ORDER BY TIMESTAMP)
FROM
(
    SELECT *,
           SPEED_DIFFERENCE / TIMESTAMP_DIFFERENCE AS ACCELERATION
    FROM
    (
        SELECT *,
               NEXT_MILES_PER_HOUR - MILES_PER_HOUR AS SPEED_DIFFERENCE,
               TIMESTAMPDIFF(SECOND, TIMESTAMP, NEXT_TIMESTAMP) AS TIMESTAMP_DIFFERENCE
        FROM
        (
            SELECT *,
                   TO_TIME(DATE_AND_HOUR) AS HOUR,
                   LEAD(TIMESTAMP) OVER (PARTITION BY REG ORDER BY TIMESTAMP) AS NEXT_TIMESTAMP,
                   LEAD(MILES_PER_HOUR) OVER (PARTITION BY REG ORDER BY TIMESTAMP) AS NEXT_MILES_PER_HOUR
            FROM
            (
                SELECT REG,
                       TIMESTAMP,
                       MILES_PER_HOUR,
                       TO_DATE(TIMESTAMP) AS DATE,
                       TO_TIME(TIMESTAMP) AS TIME,
                       DATE_TRUNC('HOUR', TIMESTAMP) AS DATE_AND_HOUR,
                       DATA
                FROM Motorycle_Data_Refurbished_Models_DuPage_County
            )
        )
    )
)

Recommended answer

To get a single value per partition, remove the ORDER BY, because it switches the function to a cumulative (running) calculation:

SELECT *,
       STDDEV(ACCELERATION) OVER (PARTITION BY REG, DATE, HOUR)
...
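Applied to the query from the question, only the outer SELECT changes; the nested subqueries that derive ACCELERATION stay as they are. A sketch (the inner queries are abbreviated, and ACCELERATION_STDDEV is just an illustrative alias):

SELECT *,
       -- no ORDER BY: one standard deviation per (REG, DATE, HOUR) partition,
       -- repeated on every row of that partition
       STDDEV(ACCELERATION) OVER (PARTITION BY REG, DATE, HOUR) AS ACCELERATION_STDDEV
FROM
(
    -- ... the subqueries from the question that compute ACCELERATION, unchanged ...
)

If only one row per (REG, DATE, HOUR) group is wanted, rather than the same value repeated on every row, a plain GROUP BY with STDDEV(ACCELERATION) as an ordinary aggregate would be the alternative.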

Example:

CREATE OR REPLACE TABLE t(i INTEGER);
INSERT INTO t (i) VALUES (6), (10), (14);

SELECT STDDEV(i) OVER(ORDER BY i) FROM t;

Output:

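Assuming STDDEV here is the sample standard deviation (Snowflake's STDDEV is an alias for STDDEV_SAMP, which returns NULL for a single-row frame), the running output for the three rows would be roughly:

    i | STDDEV(i) OVER(ORDER BY i)
   ---+----------------------------
    6 | NULL        -- frame {6}
   10 | 2.828427    -- frame {6, 10}
   14 | 4           -- frame {6, 10, 14}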

vs.

SELECT STDDEV(i) OVER() FROM t;

Output:

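With an empty OVER(), every row's frame is the whole table {6, 10, 14}, whose sample standard deviation is 4, so each row would get the same value:

    i | STDDEV(i) OVER()
   ---+------------------
    6 | 4
   10 | 4
   14 | 4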


Regarding ORDER BY applied to STDDEV:

If the OVER clause contains an ORDER BY subclause, then:

A window frame is required. If no window frame is specified explicitly, the ORDER BY implies a cumulative window frame:

   RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW

The window runs from the beginning of the partition to the current row.

You can override it:

SELECT STDDEV(i) OVER(ORDER BY i 
                      RANGE BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) 
FROM t;

Output:

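With the frame widened to the whole partition, every row again sees {6, 10, 14}, so each row would get the overall sample standard deviation of 4, the same result as OVER() without ORDER BY:

    i | STDDEV(i) OVER(ORDER BY i RANGE BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
   ---+---------------------------------------------------------------------------------------
    6 | 4
   10 | 4
   14 | 4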
