Getting error while running Spark programs in Apache Zeppelin in Windows 10 or 7


Problem description


I am new to Apache Zeppelin. I installed 0.8.0 and use port 7000 to access Zeppelin. I configured a few paths as mentioned below.


JAVA_HOME: C:\Program Files\Java\jdk1.8.0_144
HADOOP_HOME: C:\winutils
ZEPPELIN_HOME: C:\zeppelin\zeppelin-0.8.0-bin-all\zeppelin-0.8.0-bin-all


All of these variables and their respective bin folders are included in the PATH variable.


I tried to run a Spark program and got the error below. I tried multiple options to fix it, but was unable to. Please help.


Spark program:

%spark
println(sc.appName)


Error:

DEBUG [2018-07-29 00:06:05,371] ({pool-2-thread-2} RemoteInterpreterManagedProcess.java[start]:153) - callbackServer is serving now
INFO [2018-07-29 00:06:05,380] ({pool-2-thread-2} RemoteInterpreterManagedProcess.java[start]:190) - Run interpreter process [C:\zeppelin\zeppelin-0.8.0-bin-all\zeppelin-0.8.0-bin-all\bin\interpreter.cmd, -d, C:\zeppelin\zeppelin-0.8.0-bin-all\zeppelin-0.8.0-bin-all/interpreter/spark, -c, 10.120.44.23, -p, 57136, -r, :, -l, C:\zeppelin\zeppelin-0.8.0-bin-all\zeppelin-0.8.0-bin-all/local-repo/spark, -g, spark]
DEBUG [2018-07-29 00:06:09,625] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - Warning: Local jar C:\Users\vvellabo\57136 does not exist, skipping.
DEBUG [2018-07-29 00:06:09,626] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - Warning: Local jar C:\Users\vvellabo\10.120.44.23 does not exist, skipping.
DEBUG [2018-07-29 00:06:09,627] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - java.lang.ClassNotFoundException: org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
DEBUG [2018-07-29 00:06:09,628] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
DEBUG [2018-07-29 00:06:09,630] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
DEBUG [2018-07-29 00:06:09,632] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
DEBUG [2018-07-29 00:06:09,633] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at java.lang.Class.forName0(Native Method)
DEBUG [2018-07-29 00:06:09,634] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at java.lang.Class.forName(Class.java:348)
DEBUG [2018-07-29 00:06:09,635] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
DEBUG [2018-07-29 00:06:09,641] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
DEBUG [2018-07-29 00:06:09,642] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
DEBUG [2018-07-29 00:06:09,644] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
DEBUG [2018-07-29 00:06:09,645] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
DEBUG [2018-07-29 00:06:09,647] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
DEBUG [2018-07-29 00:06:09,656] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - 2018-07-29 00:06:09 INFO ShutdownHookManager:54 - Shutdown hook called
DEBUG [2018-07-29 00:06:09,659] ({Exec Stream Pumper} RemoteInterpreterManagedProcess.java[processLine]:298) - 2018-07-29 00:06:09 INFO ShutdownHookManager:54 - Deleting directory C:\Users\vvellabo\AppData\Local\Temp\spark-427c3202-c243-4761-86ce-ea51a27a881c
INFO [2018-07-29 00:06:09,747] ({Exec Default Executor} RemoteInterpreterManagedProcess.java[onProcessFailed]:250) - Interpreter process failed {}
org.apache.commons.exec.ExecuteException: Process exited with an error: 101 (Exit value: 101)
at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
at org.apache.commons.exec.DefaultExecutor.access$200(DefaultExecutor.java:48)
at org.apache.commons.exec.DefaultExecutor$1.run(DefaultExecutor.java:200)
at java.lang.Thread.run(Thread.java:748)
ERROR [2018-07-29 00:07:05,382] ({pool-2-thread-2} Job.java[run]:190) - Job failed
java.lang.RuntimeException: Warning: Local jar C:\Users\vvellabo\57136 does not exist, skipping.
Warning: Local jar C:\Users\vvellabo\10.120.44.23 does not exist, skipping.

Thanks, vvell

Answer



I faced a similar issue and kept coming back to this question in case someone had solved it.

This link will help you!


If you do not wish to go through those steps, simply delete the SPARK_HOME environment variable; Zeppelin has its own library of Spark jars. Next, go to


%Zeppelin_HOME%\conf\

and rename


zeppelin-env.cmd.template to zeppelin-env.cmd

and add the following lines:

REM Point Zeppelin at your local JDK install (adjust the path to your machine)
set JAVA="C:\Program Files\Java\jdk1.8.0_181"
set JAVA_HOME="%JAVA%"
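For convenience, the steps above can be sketched as one Windows cmd sequence. This is only a sketch of the workaround, not a tested script; it assumes %ZEPPELIN_HOME% is set as described earlier in this thread and that you edit zeppelin-env.cmd yourself.

```batch
:: 1. Clear SPARK_HOME for the current session so Zeppelin falls back to
::    its bundled Spark jars (also remove it under System Properties >
::    Environment Variables to make the change permanent)
set SPARK_HOME=

:: 2. Create zeppelin-env.cmd from the shipped template
cd /d %ZEPPELIN_HOME%\conf
copy zeppelin-env.cmd.template zeppelin-env.cmd

:: 3. Edit zeppelin-env.cmd and add the JAVA / JAVA_HOME lines shown above

:: 4. Start Zeppelin
cd /d %ZEPPELIN_HOME%
bin\zeppelin.cmd
```

Note that `set SPARK_HOME=` only affects the console session you launch Zeppelin from; a SPARK_HOME defined in the system environment will still be picked up by other sessions until it is deleted there.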



Make sure you enter the correct path to your JDK. Save the file and start Zeppelin with the command bin\zeppelin.cmd.
Your Zeppelin with Spark is ready! Confirm it by running any simple code, for example sc.version.

Hope this helps!

