Convert SQL Case Statement into Spark


Problem Description

How do I convert this SQL case statement into Spark SQL?

 replace_old_engagements_sql = """ UPDATE """ + my_table_name + """
                            SET Engagement = CASE Engagement
                                                WHEN '800000026680' THEN '800000032764'
                                                WHEN '807000000041' THEN '808000000000'
                                                WHEN '870000012569' THEN '807000000412'
                                                WHEN '807000000279' THEN '808000000223'
                                                WHEN '807000000282' THEN '808000000223'
                                                WHEN '870000000403' THEN '808000000223'
                                            END
                            WHERE LinkedAccountId IN ('123456789101', '109876543212')
                              AND Engagement IN ('800000026680', '807000000041', '870000012569',
                                                 '807000000279', '807000000282', '870000000403'); """

Recommended Answer

I guess your Spark SQL would be something close to this:

    spark.sql("""
        INSERT OVERWRITE TABLE db.my_table_name
        SELECT
            LinkedAccountId,  -- list the table's other columns here too (in table order) so the overwrite keeps them
            CASE
                WHEN LinkedAccountId IN ('123456789101', '109876543212') THEN
                    CASE
                        WHEN Engagement = '800000026680' THEN '800000032764'
                        WHEN Engagement = '807000000041' THEN '808000000000'
                        WHEN Engagement = '870000012569' THEN '807000000412'
                        WHEN Engagement = '807000000279' THEN '808000000223'
                        WHEN Engagement = '807000000282' THEN '808000000223'
                        WHEN Engagement = '870000000403' THEN '808000000223'
                        ELSE Engagement
                    END
                ELSE Engagement
            END AS Engagement
        FROM db.my_table_name
    """)
