Transform column values to columns in a PySpark dataframe


Question

I would like to transform the values of one column into multiple columns of a dataframe, in PySpark on Databricks.

For example:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Build the example dataframe from an RDD
df = spark.sparkContext.parallelize([["dapd", "shop", "retail"],
    ["dapd", "shop", "on-line"],
    ["dapd", "payment", "credit"],
    ["wrfr", "shop", "supermarket"],
    ["wrfr", "shop", "brand store"],
    ["wrfr", "payment", "cash"]]).toDF(["id", "value1", "value2"])

I need to transform it into:

id      shop                       payment
dapd    retail|on-line             credit
wrfr    supermarket|brand store    cash

I am not sure how I can do this in PySpark.

Thanks,

Answer

What you're looking for is a combination of pivot and an aggregation function such as collect_list() or collect_set(). Have a look at the available aggregation functions here: https://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=agg#module-pyspark.sql.functions. Here's a code example:

from pyspark.sql import SparkSession
import pyspark.sql.functions as f

spark = SparkSession.builder.getOrCreate()

# Recreate the example dataframe
df = spark.sparkContext.parallelize([
    ["dapd", "shop", "retail"],
    ["dapd", "shop", "on-line"],
    ["dapd", "payment", "credit"],
    ["wrfr", "shop", "supermarket"],
    ["wrfr", "shop", "brand store"],
    ["wrfr", "payment", "cash"]]
).toDF(["id", "value1", "value2"])

df.show()
+----+-------+-----------+
|  id| value1|     value2|
+----+-------+-----------+
|dapd|   shop|     retail|
|dapd|   shop|    on-line|
|dapd|payment|     credit|
|wrfr|   shop|supermarket|
|wrfr|   shop|brand store|
|wrfr|payment|       cash|
+----+-------+-----------+


# Pivot the distinct value1 entries into columns, collecting value2 into lists
df.groupBy('id').pivot('value1').agg(f.collect_list("value2")).show(truncate=False)
+----+--------+--------------------------+
|id  |payment |shop                      |
+----+--------+--------------------------+
|dapd|[credit]|[retail, on-line]         |
|wrfr|[cash]  |[supermarket, brand store]|
+----+--------+--------------------------+
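
The collected values come back as arrays. The expected output in the question joins them with a pipe instead; a minimal sketch of one way to get that exact shape (this step is not part of the original answer) is to wrap the aggregation in concat_ws():

# Join the collected value2 entries with "|" instead of returning arrays
df.groupBy('id').pivot('value1').agg(
    f.concat_ws('|', f.collect_list('value2'))
).show(truncate=False)

This yields string cells such as retail|on-line and supermarket|brand store, matching the layout requested in the question. Note that collect_list() does not guarantee element order.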
