How to Join Multiple Columns in Spark SQL using Java for filtering in DataFrame
Problem Description
- DataFrame a = contains columns x, y, z, k
- DataFrame b = contains columns x, y, a
a.join(b,<condition to use in java to use x,y >) ???
I tried using

a.join(b, a.col("x").equalTo(b.col("x")) && a.col("y").equalTo(b.col("y")), "inner")

but Java throws an error saying && is not allowed.
Recommended Answer
Spark SQL provides a group of methods on Column, marked as java_expr_ops, which are designed for Java interoperability. It includes the and method (see also or), which can be used here:
a.col("x").equalTo(b.col("x")).and(a.col("y").equalTo(b.col("y")))
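Java's && operator works only on primitive boolean operands, so it cannot be applied to Column objects; and is the Java-friendly equivalent of Scala's && on Column.

For context, here is a minimal runnable sketch. The sample data, schemas, and class name are hypothetical, since the question does not show how a and b are built:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class MultiColumnJoinExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("MultiColumnJoinExample")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical schemas matching the question's layout:
        // a has columns x, y, z, k and b has columns x, y, a.
        StructType schemaA = new StructType()
                .add("x", DataTypes.IntegerType)
                .add("y", DataTypes.IntegerType)
                .add("z", DataTypes.StringType)
                .add("k", DataTypes.StringType);
        StructType schemaB = new StructType()
                .add("x", DataTypes.IntegerType)
                .add("y", DataTypes.IntegerType)
                .add("a", DataTypes.StringType);

        // Hypothetical sample rows.
        List<Row> rowsA = Arrays.asList(
                RowFactory.create(1, 10, "z1", "k1"),
                RowFactory.create(2, 20, "z2", "k2"));
        List<Row> rowsB = Arrays.asList(
                RowFactory.create(1, 10, "a1"),
                RowFactory.create(3, 30, "a3"));

        Dataset<Row> a = spark.createDataFrame(rowsA, schemaA);
        Dataset<Row> b = spark.createDataFrame(rowsB, schemaB);

        // Chain the column expressions with and() instead of &&.
        Dataset<Row> joined = a.join(
                b,
                a.col("x").equalTo(b.col("x"))
                        .and(a.col("y").equalTo(b.col("y"))),
                "inner");

        joined.show();

        spark.stop();
    }
}

Running this would print only the rows where both x and y match in the two DataFrames (here, the row with x = 1 and y = 10).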