How to use LEFT and RIGHT keyword in Spark SQL
Problem description
I just started using SQL.

In MS SQL we have the LEFT keyword, e.g. LEFT(ColumnName, 1) IN ('D', 'A') THEN 1 ELSE 0.

How can I implement the same in Spark SQL? Kindly guide me.
Recommended answer
You can use the substring function with a positive pos to take from the left:
import org.apache.spark.sql.functions.substring
substring(column, 0, 1)
and with a negative pos to take from the right:
substring(column, -1, 1)
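The pos argument of substring is 1-based, and a negative pos counts back from the end of the string. As a rough sketch of these semantics in plain Python (the helper name spark_substring is made up for illustration; it only covers the simple cases used in this answer, not every edge case):

```python
def spark_substring(s, pos, length):
    """Mimic Spark SQL substring(str, pos, len) for the simple cases above.

    pos is 1-based; pos = 0 behaves like pos = 1; a negative pos
    counts back from the end of the string.
    """
    if pos > 0:
        start = pos - 1          # 1-based -> 0-based
    elif pos < 0:
        start = len(s) + pos     # count back from the end
    else:
        start = 0                # pos = 0 is treated like pos = 1
    return s[start:start + length]

print(spark_substring("foobar", 0, 1))   # 'f'  (leftmost character)
print(spark_substring("foobar", -1, 1))  # 'r'  (rightmost character)
```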
So in Scala you can define:
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.substring

def left(col: Column, n: Int) = {
  assert(n >= 0)
  substring(col, 0, n)
}

def right(col: Column, n: Int) = {
  assert(n >= 0)
  substring(col, -n, n)
}
import spark.implicits._  // needed for toDF and $"..."; assumes an active SparkSession named spark

val df = Seq("foobar").toDF("str")

df.select(
  Seq(left _, right _).flatMap(f => (1 to 3).map(i => f($"str", i))): _*
).show
+--------------------+--------------------+--------------------+---------------------+---------------------+---------------------+
|substring(str, 0, 1)|substring(str, 0, 2)|substring(str, 0, 3)|substring(str, -1, 1)|substring(str, -2, 2)|substring(str, -3, 3)|
+--------------------+--------------------+--------------------+---------------------+---------------------+---------------------+
| f| fo| foo| r| ar| bar|
+--------------------+--------------------+--------------------+---------------------+---------------------+---------------------+
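To come back to the original question, the MS SQL CASE WHEN LEFT(...) IN (...) THEN 1 ELSE 0 logic can also be written as a Spark SQL expression string built on substring and passed to expr() or selectExpr(). A minimal sketch (the helper name left_in_flag_expr is hypothetical, and the generated expression assumes plain single-character values):

```python
def left_in_flag_expr(colname, chars):
    """Build a Spark SQL expression string equivalent to the MS SQL
    CASE WHEN LEFT(colname, 1) IN (...) THEN 1 ELSE 0 END."""
    quoted = ", ".join("'{}'".format(c) for c in chars)
    return ("CASE WHEN substring({}, 1, 1) IN ({}) "
            "THEN 1 ELSE 0 END").format(colname, quoted)

# With a DataFrame you would use it e.g. as:
#   df.selectExpr(left_in_flag_expr("str", ["D", "A"]) + " AS flag")
print(left_in_flag_expr("str", ["D", "A"]))
```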
Similarly in Python:
from pyspark.sql.functions import substring
from pyspark.sql.column import Column

def left(col, n):
    assert isinstance(col, (Column, str))
    assert isinstance(n, int) and n >= 0
    return substring(col, 0, n)

def right(col, n):
    assert isinstance(col, (Column, str))
    assert isinstance(n, int) and n >= 0
    return substring(col, -n, n)