TypeError: 'Column' object is not callable using WithColumn
Problem description
I would like to append a new column to dataframe "df" from the function get_distance:
```python
def get_distance(x, y):
    dfDistPerc = hiveContext.sql("select column3 as column3 \
                                  from tab \
                                  where column1 = '" + x + "' \
                                  and column2 = " + y + " \
                                  limit 1")
    result = dfDistPerc.select("column3").take(1)
    return result

df = df.withColumn(
    "distance",
    lit(get_distance(df["column1"], df["column2"]))
)
```
However, I get:
TypeError: 'Column' object is not callable
I think it happens because x and y are Column objects and need to be converted to String to be used in my query. Am I right? If so, how can I do this?
Recommended answer
Spark should know that the function you are using is not an ordinary function but a UDF. So, there are two ways to use a UDF on dataframes.
Method 1: Use the @udf annotation
```python
from pyspark.sql.functions import udf

@udf
def get_distance(x, y):
    dfDistPerc = hiveContext.sql("select column3 as column3 \
                                  from tab \
                                  where column1 = '" + x + "' \
                                  and column2 = " + y + " \
                                  limit 1")
    result = dfDistPerc.select("column3").take(1)
    return result

df = df.withColumn(
    "distance",
    # the UDF call already returns a Column, so wrapping it in lit() is unnecessary
    get_distance(df["column1"], df["column2"])
)
```
Method 2: Register the UDF with pyspark.sql.functions.udf
```python
from pyspark.sql.functions import udf
from pyspark.sql.types import IntegerType

def get_distance(x, y):
    dfDistPerc = hiveContext.sql("select column3 as column3 \
                                  from tab \
                                  where column1 = '" + x + "' \
                                  and column2 = " + y + " \
                                  limit 1")
    result = dfDistPerc.select("column3").take(1)
    return result

calculate_distance_udf = udf(get_distance, IntegerType())

df = df.withColumn(
    "distance",
    # calculate_distance_udf returns a Column, so lit() is not needed here either
    calculate_distance_udf(df["column1"], df["column2"])
)
```