Apache Spark: dealing with case statements
Question
I am transforming SQL code into PySpark code and came across some SQL statements. I don't know how to approach case statements in PySpark. I am planning on creating an RDD, then using rdd.map and doing some logic checks. Is that the right approach? Please help!
Basically I need to go through each row in the RDD or DF and, based on some logic, edit one of the column values.
CASE
  WHEN (e."a" LIKE 'a%' OR e."b" LIKE 'b%')
       AND e."aa" = 'BW' AND CAST(e."abc" AS DECIMAL(10,4)) = 75.0 THEN 'callitA'
  WHEN (e."a" LIKE 'b%' OR e."b" LIKE 'a%')
       AND e."aa" = 'AW' AND CAST(e."abc" AS DECIMAL(10,4)) = 75.0 THEN 'callitB'
  ELSE 'CallitC'
END
Answer
These are a few ways to write an If-Else / When-Then-Else / When-Otherwise expression in PySpark.
Sample dataframe:
df = spark.createDataFrame([(1,1),(2,2),(3,3)],['id','value'])
df.show()
#+---+-----+
#| id|value|
#+---+-----+
#|  1|    1|
#|  2|    2|
#|  3|    3|
#+---+-----+

#Desired Output:
#+---+-----+----------+
#| id|value|value_desc|
#+---+-----+----------+
#|  1|    1|       one|
#|  2|    2|       two|
#|  3|    3|     other|
#+---+-----+----------+
Option #1: withColumn() using when-otherwise
from pyspark.sql.functions import when
df.withColumn("value_desc",when(df.value == 1, 'one').when(df.value == 2, 'two').otherwise('other')).show()
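For compound conditions like the ones in the question's CASE, the same when() accepts boolean expressions combined with & (and), | (or) and ~ (not); each sub-condition needs its own parentheses because of Python operator precedence. A minimal sketch against the sample dataframe above (the conditions themselves are just for illustration):

from pyspark.sql.functions import when

# each comparison is parenthesized before combining with | or &
df.withColumn("value_desc",
    when((df.value == 1) | (df.id == 1), 'one')
    .when((df.value == 2) & (df.id == 2), 'two')
    .otherwise('other')).show()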
Option #2: select() using when-otherwise
from pyspark.sql.functions import when
df.select("*",when(df.value == 1, 'one').when(df.value == 2, 'two').otherwise('other').alias('value_desc')).show()
Option #3: selectExpr() using the SQL-equivalent CASE expression
df.selectExpr("*","CASE WHEN value == 1 THEN 'one' WHEN value == 2 THEN 'two' ELSE 'other' END AS value_desc").show()
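SQL functions such as CAST also work verbatim inside selectExpr(), so a predicate like the question's cast(e."abc" as decimal(10,4))=75.0 can usually be carried over unchanged. A small sketch against the sample dataframe (the 1.0 comparison is only for illustration):

# CAST inside the CASE predicate, exactly as in plain SQL
df.selectExpr("*", "CASE WHEN CAST(value AS DECIMAL(10,4)) = 1.0 THEN 'one' ELSE 'other' END AS value_desc").show()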
A SQL-like expression can also be written in withColumn() and select() using the pyspark.sql.functions.expr function. Here are examples.
Option #4: select() using the expr function
from pyspark.sql.functions import expr
df.select("*",expr("CASE WHEN value == 1 THEN 'one' WHEN value == 2 THEN 'two' ELSE 'other' END AS value_desc")).show()
Option #5: withColumn() using the expr function
from pyspark.sql.functions import expr
df.withColumn("value_desc", expr("CASE WHEN value == 1 THEN 'one' WHEN value == 2 THEN 'two' ELSE 'other' END")).show()
Output:
#+---+-----+----------+
#| id|value|value_desc|
#+---+-----+----------+
#|  1|    1|       one|
#|  2|    2|       two|
#|  3|    3|     other|
#+---+-----+----------+
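Putting this together for the CASE expression in the question: since the SQL already exists, passing it through expr() or selectExpr() is the most direct migration, and it also maps to chained when() calls as above. A hedged sketch, assuming a dataframe e with string columns a, b, aa and a castable column abc (the sample rows and the simplified column names are hypothetical, not from the question):

from pyspark.sql.functions import col, when

# hypothetical rows matching the question's column names
e = spark.createDataFrame(
    [("apple", "x", "BW", "75.0"), ("bat", "ant", "AW", "75.0"), ("cat", "cow", "ZZ", "10.0")],
    ["a", "b", "aa", "abc"])

e.withColumn("label",
    when((col("a").like("a%") | col("b").like("b%"))
         & (col("aa") == "BW") & (col("abc").cast("decimal(10,4)") == 75.0), "callitA")
    .when((col("a").like("b%") | col("b").like("a%"))
          & (col("aa") == "AW") & (col("abc").cast("decimal(10,4)") == 75.0), "callitB")
    .otherwise("CallitC")).show()

Either route avoids the rdd.map approach from the question entirely; staying in the DataFrame API keeps the logic declarative and lets Spark optimize the expression.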