Replace values of one column in a Spark df by dictionary key-values (pyspark)
Question
I got stuck with a data transformation task in pyspark. I want to replace all values of one column in a df with the key-value pairs specified in a dictionary.
dict = {'A':1, 'B':2, 'C':3}
My df looks like this:
+----+----+
|col1|col2|
+----+----+
|   B|   A|
|   A|   A|
|   A|   A|
|   C|   B|
|   A|   A|
+----+----+
Now I want to replace all values of col1 with the key-value pairs defined in dict.
Desired output:
+----+----+
|col1|col2|
+----+----+
|   2|   A|
|   1|   A|
|   1|   A|
|   3|   B|
|   1|   A|
+----+----+
I tried
df.na.replace(dict, 1).show()
but that also replaces the values in col2, which should stay untouched.
Thank you for your help. Greetings :)
Answer
Your data:
print(df)
DataFrame[col1: string, col2: string]
df.show()
+----+----+
|col1|col2|
+----+----+
| B| A|
| A| A|
| A| A|
| C| B|
| A| A|
+----+----+
diz = {"A":1, "B":2, "C":3}
Convert the values of your dictionary from integers to strings, to avoid errors about replacing values of different types:
diz = {k: str(v) for k, v in diz.items()}
print(diz)
{'A': '1', 'C': '3', 'B': '2'}
Replace the values of col1:
df2 = df.na.replace(diz, 1, "col1")
print(df2)
DataFrame[col1: string, col2: string]
df2.show()
+----+----+
|col1|col2|
+----+----+
| 2| A|
| 1| A|
| 1| A|
| 3| B|
| 1| A|
+----+----+
If you need to cast your values from string to integer:
from pyspark.sql.types import *
df3 = df2.select(df2["col1"].cast(IntegerType()), df2["col2"])
print(df3)
DataFrame[col1: int, col2: string]
df3.show()
+----+----+
|col1|col2|
+----+----+
| 2| A|
| 1| A|
| 1| A|
| 3| B|
| 1| A|
+----+----+