How to enable Cartesian join in Spark 2.0?


Question

I have to cross join 2 DataFrames in Spark 2.0 and I am encountering the error below:

User class threw exception:

org.apache.spark.sql.AnalysisException: Cartesian joins could be prohibitively expensive and are disabled by default. To explicitly enable them, please set spark.sql.crossJoin.enabled = true; 

Please help me figure out where to set this configuration; I am coding in Eclipse.

Answer

As the error message clearly states, you need to set spark.sql.crossJoin.enabled = true in your Spark configuration.

You can set it like this:

import org.apache.spark.SparkConf

val sparkConf = new SparkConf().setAppName("Test")
// enable Cartesian (cross) joins, which Spark 2.0 disables by default
sparkConf.set("spark.sql.crossJoin.enabled", "true")

Then build your SparkSession by passing this SparkConf:

import org.apache.spark.sql.SparkSession

val sparkSession = SparkSession.builder().config(sparkConf).getOrCreate()

Then do your join...
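For example, here is a minimal sketch (the DataFrames df1 and df2 below are hypothetical, made up purely for illustration): once the flag is enabled, joining two DataFrames without a join condition produces the Cartesian product instead of throwing the AnalysisException above.

import sparkSession.implicits._

// two small example DataFrames (hypothetical data, for illustration only)
val df1 = Seq(1, 2, 3).toDF("a")
val df2 = Seq("x", "y").toDF("b")

// a join with no condition yields the Cartesian product;
// this requires spark.sql.crossJoin.enabled = true in Spark 2.0
val result = df1.join(df2)
result.show()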

