How to get the name of the column with the maximum value in a PySpark dataframe
Question
How do we get the name of the column with the maximum value in each row of a PySpark dataframe?
   Alice  Eleonora  Mike  Helen       MAX
0      2         7     8      6      Mike
1     11         5     9      4     Alice
2      6        15    12      3  Eleonora
3      5         3     7      8     Helen
I need something like this: the name of the column, not the max value itself. I am able to get the max values; I need the names.
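A minimal setup sketch to reproduce this sample data, assuming an active SparkSession bound to the name spark (the variable df is what all the snippets below operate on):

# Hypothetical setup: rebuild the sample data from the question.
df = spark.createDataFrame(
    [(2, 7, 8, 6), (11, 5, 9, 4), (6, 15, 12, 3), (5, 3, 7, 8)],
    ["Alice", "Eleonora", "Mike", "Helen"],
)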
Solution
You can chain conditions to find which column is equal to the maximum value:
import pyspark.sql.functions as psf

# Build a chained when() expression as a string, one branch per column, and
# eval() it: the first column equal to max_value supplies the MAX label.
cond = "psf.when" + ".when".join(["(psf.col('" + c + "') == psf.col('max_value'), psf.lit('" + c + "'))" for c in df.columns])

df.withColumn("max_value", psf.greatest(*df.columns))\
    .withColumn("MAX", eval(cond))\
    .show()
+-----+--------+----+-----+---------+--------+
|Alice|Eleonora|Mike|Helen|max_value|     MAX|
+-----+--------+----+-----+---------+--------+
|    2|       7|   8|    6|        8|    Mike|
|   11|       5|   9|    4|       11|   Alice|
|    6|      15|  12|    3|       15|Eleonora|
|    5|       3|   7|    8|        8|   Helen|
+-----+--------+----+-----+---------+--------+
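If you would rather not build a string and eval() it, the same chained when() expression can be assembled directly as Column objects; a sketch under the same assumptions (psf imported as above, df as in the question):

from functools import reduce

# Sketch: fold the columns into one chained when() expression instead of eval'ing a string.
first, rest = df.columns[0], df.columns[1:]
when_expr = reduce(
    lambda acc, c: acc.when(psf.col(c) == psf.col("max_value"), psf.lit(c)),
    rest,
    psf.when(psf.col(first) == psf.col("max_value"), psf.lit(first)),
)
df.withColumn("max_value", psf.greatest(*df.columns))\
    .withColumn("MAX", when_expr)\
    .show()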
Or: explode and filter:
from itertools import chain

# posexplode turns the (column name -> value) map into pos/key/value rows;
# keeping only the rows where value equals the row maximum gives the name in key.
df.withColumn("max_value", psf.greatest(*df.columns))\
    .select("*", psf.posexplode(psf.create_map(list(chain(*[(psf.lit(c), psf.col(c)) for c in df.columns])))))\
    .filter("max_value = value")\
    .select(df.columns + [psf.col("key").alias("MAX")])\
    .show()
Or: using a UDF on a dictionary:
from itertools import chain
from pyspark.sql.types import StringType

# Collect each row into a (column name -> value) map and let a Python UDF
# return the key with the largest value.
argmax_udf = psf.udf(lambda m: max(m, key=m.get), StringType())
df.withColumn("map", psf.create_map(list(chain(*[(psf.lit(c), psf.col(c)) for c in df.columns]))))\
    .withColumn("MAX", argmax_udf("map"))\
    .drop("map")\
    .show()
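Note that max(m, key=m.get) returns a single name even when several columns tie for the maximum. If all tied names are needed, a hedged variant (argmax_all_udf is an illustrative name; same imports and df as above):

# Sketch: return every column name that ties for the row maximum, comma-separated.
argmax_all_udf = psf.udf(
    lambda m: ",".join(k for k, v in m.items() if v == max(m.values())),
    StringType(),
)
df.withColumn("map", psf.create_map(list(chain(*[(psf.lit(c), psf.col(c)) for c in df.columns]))))\
    .withColumn("MAX", argmax_all_udf("map"))\
    .drop("map")\
    .show()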
Or: using a UDF with arguments:
from pyspark.sql.types import StringType

# Pass all columns to a Python UDF and return the name of the first column
# whose value equals the row maximum.
def argmax(cols, *args):
    return [c for c, v in zip(cols, args) if v == max(args)][0]

argmax_udf = lambda cols: psf.udf(lambda *args: argmax(cols, *args), StringType())
df.withColumn("MAX", argmax_udf(df.columns)(*df.columns))\
    .show()
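As a further sketch not taken from the original answer: the same result can usually be obtained without a UDF or eval() by packing each value together with its column name into a struct and taking the greatest struct, since Spark orders structs field by field (value first, then name):

# Sketch: greatest() over (value, name) structs picks the maximum value, and the
# column name is read back from the winning struct. On ties this prefers the
# lexicographically largest name, unlike the when()-chain above.
max_struct = psf.greatest(
    *[psf.struct(psf.col(c).alias("value"), psf.lit(c).alias("name")) for c in df.columns]
)
df.withColumn("MAX", max_struct["name"]).show()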