pyspark interpreter not found in apache zeppelin


Problem description

I am having an issue using pyspark in an Apache Zeppelin (version 0.6.0) notebook. Running the following simple code gives me a pyspark interpreter not found error:

%pyspark
a = 1+3

Running sc.version gives me res2: String = 1.6.0, which is the version of Spark installed on my machine. And running z returns res0: org.apache.zeppelin.spark.ZeppelinContext = {}.

  1. Pyspark works from the CLI (using Spark 1.6.0 and Python 2.6.6).

  2. The default Python on the machine is 2.6.6, while Anaconda Python 3.5 is also installed but not set as the default.

Based on this post, I updated the zeppelin-env.sh file located at /usr/hdp/current/zeppelin-server/lib/conf and added the Anaconda Python 3 path:

export PYSPARK_PYTHON=/opt/anaconda3/bin/python
export PYTHONPATH=/opt/anaconda3/bin/python
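One likely problem with the settings above: PYTHONPATH conventionally lists directories of importable modules, not the interpreter binary itself. A minimal sketch of what zeppelin-env.sh would more typically contain for PySpark follows; the exact paths and the py4j zip version are assumptions and must match your own Spark install:

```shell
# Sketch of zeppelin-env.sh entries (paths are assumptions; adjust to your install).
export PYSPARK_PYTHON=/opt/anaconda3/bin/python   # the interpreter binary goes here

# PYTHONPATH should point at module directories, not the python binary.
# For PySpark this is usually Spark's python dir plus its bundled py4j zip
# (the py4j version below is illustrative -- check $SPARK_HOME/python/lib).
export SPARK_HOME=/usr/hdp/current/spark-client
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip"
```

Setting PYTHONPATH to the python binary, as in the question, would not break Python itself but leaves PySpark's modules unresolvable.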

Since then, I have restarted Zeppelin several times using

/usr/hdp/current/zeppelin-server/lib/bin/zeppelin-daemon.sh

but I can't get the pyspark interpreter to work in Zeppelin.

Accepted answer

For anyone who finds pyspark not responding: try restarting the Spark interpreter in Zeppelin; this may resolve the pyspark not responding error.
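Besides the interpreter menu in the notebook UI, the Spark interpreter can also be restarted through Zeppelin's interpreter REST API. A hedged sketch, assuming Zeppelin is listening on localhost:8080 (the default port) and that you look up your own interpreter setting id first:

```shell
# List interpreter settings to find the id of the spark interpreter setting
# (the host/port are assumptions; use your Zeppelin server's address).
curl -s http://localhost:8080/api/interpreter/setting

# Restart that interpreter setting by id (replace <setting-id> with the id
# found above; the "..." elision is deliberate -- ids are install-specific).
curl -X PUT http://localhost:8080/api/interpreter/setting/restart/<setting-id>
```

This is equivalent to clicking the restart button for the spark interpreter in the UI, and is convenient when scripting a recovery step.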

