spark-submit on standalone cluster complains about scala-2.10 jars not existing


Problem description

I'm new to Spark and downloaded the pre-compiled Spark binaries from Apache (spark-2.1.0-bin-hadoop2.7).

When submitting my Scala (2.11.8) uber jar, the cluster throws an error:

java.lang.IllegalStateException: Library directory '/root/spark/assembly/target/scala-2.10/jars' does not exist; make sure Spark is built

I'm not running Scala 2.10, and as far as I know Spark isn't compiled with Scala 2.10.

Could it be that one of my dependencies is based on Scala 2.10?
Any suggestions on what could be wrong?

Recommended answer

Try setting SPARK_HOME="location of your Spark installation" on your system or in your IDE.
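The path in the exception (`.../assembly/target/scala-2.10/jars`) is the layout of a Spark *source* checkout, which suggests SPARK_HOME is resolving to the wrong directory. A minimal sketch of the fix on Linux/macOS, assuming the pre-built tarball was unpacked to `/opt/spark-2.1.0-bin-hadoop2.7` (the path is an example; substitute your own):

```shell
# Point SPARK_HOME at the extracted pre-built distribution,
# NOT at a Spark source checkout (which has assembly/target/... paths).
# /opt/spark-2.1.0-bin-hadoop2.7 is an assumed example location.
export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.7

# Put its bin/ first so spark-submit resolves from this installation.
export PATH="$SPARK_HOME/bin:$PATH"

echo "$SPARK_HOME"
```

To persist the setting, the same two `export` lines can go in `~/.bashrc` (or your IDE's run-configuration environment variables); you can then confirm the right installation is picked up with `spark-submit --version`.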

