How to calculate the sum of the current row with the next one?
Question
In Spark-SQL version 1.6, using DataFrames, is there a way to calculate, for a specific column, the sum of the current row and the next one, for every row?
For example, if I have a table with one column, like so:
Age
12
23
31
67
I want the following output:
Sum
35
54
98
The last row is dropped because it has no "next row" to be added to.
Right now I am doing it by ranking the table and joining it with itself, where the rank is equal to rank+1.
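For reference, the rank-and-self-join approach described above can be sketched roughly as follows (this is an assumed reconstruction, not the asker's exact code; `dataframe` is a DataFrame with an `Age` column, and `row_number` is used as the ranking function):

```scala
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// Assign a rank to each row, ordered by Age
val w = Window.orderBy("Age")
val ranked = dataframe.withColumn("rank", row_number().over(w))

// Join each row ("cur") with the row ranked one higher ("next");
// the last row has no match and is dropped by the inner join
val summed = ranked.as("cur")
  .join(ranked.as("next"), col("cur.rank") === col("next.rank") - 1)
  .select((col("cur.Age") + col("next.Age")).as("Sum"))
```

The join forces a shuffle of the full table against itself, which is the main cost a window function avoids.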
Is there a better way to do this? Can it be done with a Window function?
Answer
Yes, you can definitely do this with a Window function by using rowsBetween. In the following example I use a person column for grouping purposes.
import sqlContext.implicits._
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

val dataframe = Seq(
  ("A", 12),
  ("A", 23),
  ("A", 31),
  ("A", 67)
).toDF("person", "Age")

// Frame spanning the current row and the next one (offsets 0 to 1)
val windowSpec = Window.partitionBy("person").orderBy("Age").rowsBetween(0, 1)

// Sum "Age" over the two-row frame
val newDF = dataframe.withColumn("sum", sum(dataframe("Age")).over(windowSpec))

// Drop the last row of each partition: its frame contains only itself,
// so its sum equals its own Age
newDF.filter(!(newDF("Age") === newDF("sum"))).show
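A variant worth considering (a sketch of my own, not part of the original answer) uses the lead window function to fetch the next row's value explicitly; it makes dropping the last row more direct, since lead returns null when there is no next row. Note that in Spark 1.6, window functions require a HiveContext rather than a plain SQLContext.

```scala
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, lead}

val w = Window.partitionBy("person").orderBy("Age")

// lead("Age", 1) is the Age of the next row within the partition,
// or null for the last row
val withNext = dataframe.withColumn("nextAge", lead(col("Age"), 1).over(w))

// Drop the last row (null nextAge) and add the pair
withNext
  .filter(col("nextAge").isNotNull)
  .select((col("Age") + col("nextAge")).as("Sum"))
  .show
```

This avoids the filter-on-equality trick above, which would wrongly drop a middle row whenever the next Age happened to be 0.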