Calculating Cumulative sum in PySpark using Window Functions


Question

I have the following sample DataFrame:

rdd = sc.parallelize([(1,20), (2,30), (3,30)])
df2 = spark.createDataFrame(rdd, ["id", "duration"])
df2.show()

+---+--------+
| id|duration|
+---+--------+
|  1|      20|
|  2|      30|
|  3|      30|
+---+--------+

I want to sort this DataFrame in descending order of duration and add a new column holding the cumulative sum of duration. So I did the following:

from pyspark.sql import Window
from pyspark.sql.functions import sum  # shadows the builtin; pyspark.sql.functions.sum is required here

windowSpec = Window.orderBy(df2['duration'].desc())

df_cum_sum = df2.withColumn("duration_cum_sum", sum('duration').over(windowSpec))

df_cum_sum.show()

+---+--------+----------------+
| id|duration|duration_cum_sum|
+---+--------+----------------+
|  2|      30|              60|
|  3|      30|              60|
|  1|      20|              80|
+---+--------+----------------+

The output I want is:

+---+--------+----------------+
| id|duration|duration_cum_sum|
+---+--------+----------------+
|  2|      30|              30| 
|  3|      30|              60| 
|  1|      20|              80|
+---+--------+----------------+

How can I get this?

Here is the breakdown:

+--------+----------------+
|duration|duration_cum_sum|
+--------+----------------+
|      30|              30| #First value
|      30|              60| #Current duration + previous cum sum value
|      20|              80| #Current duration + previous cum sum value     
+--------+----------------+

Answer

You can introduce row_number to break the ties; written in SQL:

df2.selectExpr(
    "id", "duration",
    "sum(duration) over (order by row_number() over (order by duration desc)) as duration_cum_sum"
).show()

+---+--------+----------------+
| id|duration|duration_cum_sum|
+---+--------+----------------+
|  2|      30|              30|
|  3|      30|              60|
|  1|      20|              80|
+---+--------+----------------+
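
The original attempt summed both duration-30 rows together because a window with an ORDER BY but no explicit frame defaults to RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW, and a RANGE frame treats every row tied on the ordering key as a peer of the current row. Giving each row a unique row_number makes the ordering key unique, so no ties remain.

For reference, here is a sketch of the same idea in the DataFrame API; the intermediate column name rn is just an illustrative choice:

from pyspark.sql import Window
from pyspark.sql import functions as F

# A unique row number breaks ties between equal durations
w_rank = Window.orderBy(F.col("duration").desc())

# Running sum over the now-unique ordering; the explicit ROWS frame
# also sidesteps the RANGE/peer-row behaviour described above
w_sum = (Window.orderBy("rn")
               .rowsBetween(Window.unboundedPreceding, Window.currentRow))

df_cum_sum = (df2
    .withColumn("rn", F.row_number().over(w_rank))
    .withColumn("duration_cum_sum", F.sum("duration").over(w_sum))
    .drop("rn"))

df_cum_sum.show()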
