Median / quantiles within PySpark groupBy
Question
I would like to calculate group quantiles on a Spark dataframe (using PySpark). Either an approximate or exact result would be fine. I prefer a solution that I can use within the context of groupBy / agg, so that I can mix it with other PySpark aggregate functions. If this is not possible for some reason, a different approach would be fine as well.
This question is related but does not indicate how to use approxQuantile as an aggregate function.
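For reference, approxQuantile is a DataFrame method that operates on the whole DataFrame and returns a plain Python list rather than a Column, which is why it cannot simply be dropped into agg. A minimal sketch, using the example DataFrame defined below:

# approxQuantile returns a Python list (one float per requested quantile),
# computed over the whole DataFrame, so it cannot be mixed with other
# expressions inside agg(); the last argument is the relative error
global_median = df.approxQuantile('val', [0.5], 0.25)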
I also have access to the percentile_approx Hive UDF but I don't know how to use it as an aggregate function.
For the sake of specificity, suppose I have the following dataframe:
from pyspark.sql import SparkSession
import pyspark.sql.functions as f

spark = SparkSession.builder.getOrCreate()  # an active SparkSession is needed for toDF()
sc = spark.sparkContext
df = sc.parallelize([
    ['A', 1],
    ['A', 2],
    ['A', 3],
    ['B', 4],
    ['B', 5],
    ['B', 6],
]).toDF(('grp', 'val'))

# magic_percentile does not exist; it stands for the aggregate function I am looking for
df_grp = df.groupBy('grp').agg(f.magic_percentile('val', 0.5).alias('med_val'))
df_grp.show()
The expected result is:
+----+-------+
| grp|med_val|
+----+-------+
| A| 2|
| B| 5|
+----+-------+
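As a side note, the percentile_approx Hive UDF mentioned above can also be invoked through plain SQL once the DataFrame is registered as a temporary view. This is only a sketch (the view name df_view is arbitrary), not the groupBy/agg-style solution asked for:

df.createOrReplaceTempView('df_view')
spark.sql('SELECT grp, percentile_approx(val, 0.5) AS med_val FROM df_view GROUP BY grp').show()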
Answer
I guess you don't need it anymore, but I'll leave it here for future generations (i.e. me next week when I forget).
from pyspark.sql import Window
import pyspark.sql.functions as F

# percentile_approx is a Hive/SQL function; it can be reached from the
# DataFrame API through F.expr and applied over a per-group window
grp_window = Window.partitionBy('grp')
magic_percentile = F.expr('percentile_approx(val, 0.5)')

df.withColumn('med_val', magic_percentile.over(grp_window))
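Note that the window version keeps every input row and attaches the group median to each of them; to collapse to one row per group you can deduplicate afterwards (a sketch):

(df.withColumn('med_val', magic_percentile.over(grp_window))
   .select('grp', 'med_val')
   .dropDuplicates()
   .show())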
Or, to address your question exactly, this also works:
df.groupBy('grp').agg(magic_percentile.alias('med_val'))
And as a bonus, you can pass an array of percentiles:
quantiles = F.expr('percentile_approx(val, array(0.25, 0.5, 0.75))')
And you'll get a list in return.
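If separate columns are preferred over a single array column, one option (the column names here are just examples) is to index into the result with getItem:

(df.groupBy('grp')
   .agg(quantiles.alias('quantiles'))
   .select('grp',
           F.col('quantiles').getItem(0).alias('q25'),
           F.col('quantiles').getItem(1).alias('q50'),
           F.col('quantiles').getItem(2).alias('q75'))
   .show())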
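A related note: on Spark 3.1 and later, percentile_approx is also exposed directly as pyspark.sql.functions.percentile_approx, so the F.expr detour can be skipped (assuming that Spark version):

# Requires Spark 3.1+
df.groupBy('grp').agg(F.percentile_approx('val', 0.5).alias('med_val')).show()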