Why does starting spark-shell fail with "we couldn't find any external IP address!" on Windows?

Problem description

I am having trouble starting spark-shell on my Windows computer. The version of Spark I am using is 1.5.2, pre-built for Hadoop 2.4 or later. I thought spark-shell.cmd could be run directly without any configuration since it is a pre-built package, but I cannot figure out what is preventing me from starting Spark correctly.

Aside from the error messages printed out, I can still execute some basic Scala commands at the prompt, but apparently something is going wrong here.

Here is the error log from cmd:

   log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.li
b.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more in
fo.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.propertie
s
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_25)
Type in expressions to have them evaluated.
Type :help for more information.
15/11/18 17:51:32 WARN MetricsSystem: Using default name DAGScheduler for source
 because spark.app.id is not set.
Spark context available as sc.
15/11/18 17:51:39 WARN General: Plugin (Bundle) "org.datanucleus" is already reg
istered. Ensure you dont have multiple JAR versions of the same plugin in the cl
asspath. The URL "file:/C:/spark-1.5.2-bin-hadoop2.4/lib/datanucleus-core-3.2.10
.jar" is already registered, and you are trying to register an identical plugin
located at URL "file:/C:/spark-1.5.2-bin-hadoop2.4/bin/../lib/datanucleus-core-3
.2.10.jar."
15/11/18 17:51:39 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is
 already registered. Ensure you dont have multiple JAR versions of the same plug
in in the classpath. The URL "file:/C:/spark-1.5.2-bin-hadoop2.4/lib/datanucleus
-rdbms-3.2.9.jar" is already registered, and you are trying to register an ident
ical plugin located at URL "file:/C:/spark-1.5.2-bin-hadoop2.4/bin/../lib/datanu
cleus-rdbms-3.2.9.jar."
15/11/18 17:51:39 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is alr
eady registered. Ensure you dont have multiple JAR versions of the same plugin i
n the classpath. The URL "file:/C:/spark-1.5.2-bin-hadoop2.4/bin/../lib/datanucl
eus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an
identical plugin located at URL "file:/C:/spark-1.5.2-bin-hadoop2.4/lib/datanucl
eus-api-jdo-3.2.6.jar."
15/11/18 17:51:39 WARN Connection: BoneCP specified but not present in CLASSPATH
 (or one of dependencies)
15/11/18 17:51:40 WARN Connection: BoneCP specified but not present in CLASSPATH
 (or one of dependencies)
15/11/18 17:51:46 WARN ObjectStore: Version information not found in metastore.
hive.metastore.schema.verification is not enabled so recording the schema versio
n 1.2.0
15/11/18 17:51:46 WARN ObjectStore: Failed to get database default, returning No
SuchObjectException
15/11/18 17:51:47 WARN : Your hostname, Lenovo-PC resolves to a loopback/non-reachab
le address: fe80:0:0:0:297a:e76d:828:59dc%wlan2, but we couldn't find any extern
al IP address!
java.lang.RuntimeException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.jav
a:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.s
cala:171)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveCo
ntext.scala:162)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala
:160)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstruct
orAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingC
onstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:10
28)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.
java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
sorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:
1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:
1340)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840
)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:8
57)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.sca
la:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply
(SparkILoopInit.scala:132)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply
(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoop
Init.scala:124)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)

        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$Spark
ILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.s
cala:159)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkIL
oopInit.scala:108)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:
64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$Spark
ILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$Spark
ILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$Spark
ILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClass
Loader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$pr
ocess(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.
java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
sorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSub
mit$$runMain(SparkSubmit.scala:674)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:18
0)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:
650)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
        at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.
loadPermissionInfo(RawLocalFileSystem.java:559)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.
getPermission(RawLocalFileSystem.java:534)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(Sess
ionState.java:599)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(Sess
ionState.java:554)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.jav
a:508)
        ... 56 more

<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

Solution

There are a couple of issues here. You're on Windows, and things work differently on this OS than on POSIX-compliant operating systems.

Start by reading the Problems running Hadoop on Windows document and check whether a missing winutils.exe is the issue. Also make sure you run spark-shell from a console with administrator rights.
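
If a missing winutils.exe does turn out to be the problem, the usual setup is roughly the sketch below; note that C:\hadoop is only an example path, and you need a winutils.exe built for your Hadoop version (2.4.x here):

rem Sketch only: assumes winutils.exe was copied into C:\hadoop\bin (example path)
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%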

You may also want to read the answers to a similar question, "Why does starting spark-shell fail with NullPointerException on Windows?"
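
In many of those reports the NullPointerException comes out of Hive's session-state setup when it cannot read local file permissions (compare the RawLocalFileSystem frames in your stack trace). A commonly cited workaround, assuming winutils.exe is configured as above, is to pre-create Hive's scratch directory and open up its permissions:

rem Workaround sketch: \tmp\hive is Hive's default scratch directory on the current drive
mkdir C:\tmp\hive
C:\hadoop\bin\winutils.exe chmod -R 777 \tmp\hive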

Also, you may have started spark-shell from inside the bin subdirectory, hence warnings like:

15/11/18 17:51:39 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/spark-1.5.2-bin-hadoop2.4/bin/../lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/spark-1.5.2-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar."
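
If that is the case, a simple fix is to launch the shell from the top of the Spark installation instead of from bin (paths as in your log):

rem Start from the installation root rather than from inside bin\
cd C:\spark-1.5.2-bin-hadoop2.4
bin\spark-shell.cmd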

And the last issue:

15/11/18 17:51:47 WARN : Your hostname, Lenovo-PC resolves to a loopback/non-reachable address: fe80:0:0:0:297a:e76d:828:59dc%wlan2, but we couldn't find any external IP address!

One workaround is to set SPARK_LOCAL_HOSTNAME to some resolvable host name and be done with it.

  • SPARK_LOCAL_HOSTNAME is the custom host name that overrides any other candidate host names when the driver, master, workers, and executors are created.

In your case, using spark-shell, just execute the following:

SPARK_LOCAL_HOSTNAME=localhost ./bin/spark-shell
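
The line above uses the Unix shell's inline environment-variable syntax; on Windows, where cmd does not support that form, the rough equivalent is:

rem Windows cmd equivalent: set the variable first, then start the shell
set SPARK_LOCAL_HOSTNAME=localhost
bin\spark-shell.cmd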

You can also use:

./bin/spark-shell --conf spark.driver.host=localhost

Refer also to Environment Variables in the official documentation of Spark.
