ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed


Problem description

I have installed the following setup: Hadoop 1.0.3, Java 1.7.0_67, Scala 2.11.7, and Spark 2.1.1.

I am getting the error below when launching spark-shell; can anyone help me with this?

root@sparkmaster:/home/user# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 01:07:35 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/07/05 01:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 01:07:37 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.

17/07/05 01:07:37 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing 


<console>:14: error: not found: value spark
       import spark.implicits._

<console>:14: error: not found: value spark
       import spark.sql


Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 

Answer

There are a few different solutions. In each case the idea is the same: by default the driver resolves the machine's hostname to choose its bind address, so the fix is to make that name resolve to (or explicitly point Spark at) an address the machine can actually bind.

  1. Get your hostname:

$ hostname
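
A failed bind usually means the name printed above resolves to an address that no local interface owns. A quick way to check, assuming a Linux system with getent available:

$ getent hosts $(hostname)

If this prints nothing, or an address that is not configured on this machine, the sparkDriver service cannot bind to it.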

Then try to assign your hostname:

$ sudo hostname -s 127.0.0.1

Start spark-shell.

  2. Add your hostname to your /etc/hosts file (if not present):

127.0.0.1      your_hostname
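
For reference, a minimal /etc/hosts would then look something like this, where your_hostname stands for whatever the hostname command printed:

127.0.0.1      localhost
127.0.0.1      your_hostname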

  3. Add an environment variable:

    export SPARK_LOCAL_IP="127.0.0.1"

    added to load-spark-env.sh (the script the Spark launch commands source to pick up environment settings).
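
    If you want to verify the effect before editing any script, the variable can also be set for a single launch; this is only a quick test, not a permanent change:

    $ SPARK_LOCAL_IP=127.0.0.1 spark-shell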
    

  4. The steps above solved my problem, but you can also try adding

    export SPARK_LOCAL_IP=127.0.0.1 
    

    under the comment for the local IP in the template file spark-env.sh.template (/usr/local/Cellar/apache-spark/2.1.0/libexec/conf/), and then:

    cp spark-env.sh.template spark-env.sh
    spark-shell
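
    Once spark-shell starts without the BindException, a quick sanity check at the scala> prompt confirms the SparkContext is live (spark is the SparkSession the shell creates; the second line is the expected output):

    scala> spark.range(100).count
    res0: Long = 100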
    

  5. If none of the above fixes it, check your firewall and enable it if it is not already enabled; for example:
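
    On Ubuntu, for instance, the firewall state can be checked with ufw; substitute whichever firewall tool your distribution uses:

    $ sudo ufw status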
