Getting NullPointerException when running Spark code in Zeppelin 0.7.1


Problem description

I have installed Zeppelin 0.7.1. When I try to execute the example Spark program (available in the Zeppelin Tutorial notebook), I get the following error:

java.lang.NullPointerException
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:391)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:380)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:828)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

I have also set up the config file (zeppelin-env.sh) to point to my Spark installation and Hadoop configuration directory:

export SPARK_HOME="/${homedir}/sk"
export HADOOP_CONF_DIR="/${homedir}/hp/etc/hadoop"
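
As a quick sanity check of those two exports (the paths are the ones from above; nothing else is assumed), you can confirm that SPARK_HOME points at a real Spark 2.1.0 install and that the Hadoop config directory contains the usual site files:

    # should report version 2.1.0 in its banner if SPARK_HOME is correct
    $SPARK_HOME/bin/spark-submit --version

    # the Hadoop config dir should contain the cluster's site files
    ls $HADOOP_CONF_DIR/core-site.xml $HADOOP_CONF_DIR/hdfs-site.xml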

The Spark version I am using is 2.1.0 and Hadoop is 2.7.3.

I am also using the default Spark interpreter configuration (so Spark is set to run in local mode).
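
For reference, "default configuration" here means the stock Spark interpreter properties that ship with Zeppelin 0.7.x; the two that matter for this question are sketched below (the values shown are the assumed shipped defaults, not something stated in the original post):

    master                         local[*]
    zeppelin.spark.useHiveContext  true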

Am I missing something here?

PS: I am able to use spark-shell.

Answer

I have just found a solution to this issue for Zeppelin 0.7.2:

Root cause: Spark tries to set up a Hive context, but the HDFS service is not running, so the HiveContext ends up null and a NullPointerException is thrown.
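
A quick way to confirm this diagnosis (these are standard Hadoop commands, not something from the original answer) is to check whether the HDFS daemons are actually up before retrying the notebook:

    # NameNode and DataNode should appear in the Java process list
    jps

    # or ask HDFS directly; this fails if the NameNode is down
    hdfs dfsadmin -report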

Solution:
1. Set up Spark Home [optional] and HDFS.
2. Start the HDFS service.
3. Restart the Zeppelin server (see the sketch after this list).

OR

1. Go to Zeppelin's Interpreter settings.
2. Select the Spark interpreter.
3. Set zeppelin.spark.useHiveContext = false.
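
A minimal sketch of both options, assuming a single-node setup where HADOOP_HOME and ZEPPELIN_HOME point at the Hadoop and Zeppelin installs (those variable names are my assumption, not part of the original answer):

    # Option 1: bring up HDFS, then restart Zeppelin so the Spark
    # interpreter is re-created against a running HDFS
    $HADOOP_HOME/sbin/start-dfs.sh
    $ZEPPELIN_HOME/bin/zeppelin-daemon.sh restart

    # Option 2 (no HDFS required): in the Zeppelin UI open
    # Interpreter -> spark, set zeppelin.spark.useHiveContext to false,
    # save, and restart the interpreter from the same page.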
