Unable to Connect to remote Apache-Spark


Problem Description


I'm new to apache-spark and I'm experiencing some issues while trying to connect from my local machine to a remote server that contains a working Spark instance.


I successfully managed to connect via an SSH tunnel to that server using JSch, but I get the following error:


Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:65)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:82)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:220)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:162)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at server.Server$.main(Server.scala:45)
    at server.Server.main(Server.scala)


When trying to connect to Spark.


This is my Scala code:

import org.apache.spark.{SparkConf, SparkContext}

// Point the driver at the remote standalone master and run a trivial job.
val conf = new SparkConf().setAppName("Test").setMaster("spark://xx.xxx.xxx.x:7077")
val sc = new SparkContext(conf)
val rdd = sc.parallelize(Array(1, 2, 3, 4, 5)).count()
println(rdd)


Line 45, highlighted at (Server.scala:45) in the error, is the one with new SparkContext(conf).


Both on the local and the remote machine I'm using Scala ~2.11.6. In my local pom.xml file I imported Scala 2.11.6, plus spark-core_2.10 and spark-sql_2.10, both ~2.1.1. On my server I installed Spark ~2.1.1. On the server I also managed to set the master to the local machine by editing conf/spark-env.sh.
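Note that the _2.10 suffix in spark-core_2.10 and spark-sql_2.10 means those artifacts are compiled against Scala 2.10, not 2.11. For reference, here is a minimal sketch of what the pom.xml dependency section would look like with the Scala binary version aligned to 2.11 (the version numbers are taken from the question and may need adjusting):

<properties>
  <scala.version>2.11.6</scala.version>
  <spark.version>2.1.1</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>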


Of course, I managed to test the server's Spark itself and it works just fine.

What am I doing wrong?

Recommended Answer


From the docs of setMaster:


The master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.


If you run it from the Spark cluster itself (as I understand you are), you should use local[n].
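For illustration, a minimal sketch of the master URL forms described in the quoted documentation (spark://master:7077 is the placeholder host from the docs, not the asker's actual address):

import org.apache.spark.SparkConf

// Run the driver locally with a single thread.
val localConf = new SparkConf().setAppName("Test").setMaster("local")

// Run locally using 4 cores.
val local4Conf = new SparkConf().setAppName("Test").setMaster("local[4]")

// Run against a standalone cluster master.
val clusterConf = new SparkConf().setAppName("Test").setMaster("spark://master:7077")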

