Unable to load pyspark inside virtualenv


Problem description

I installed pyspark in a Python virtualenv. I also installed the newly released JupyterLab (http://jupyterlab.readthedocs.io/en/stable/getting_started/installation.html) in the same virtualenv. I was unable to launch pyspark inside a jupyter notebook in such a way that the SparkContext variable was available.

Recommended answer

First, activate the virtualenv and set the environment variables below (the python2.7 segment of the SPARK_HOME path depends on your Python version), then run pyspark; with PYSPARK_DRIVER_PYTHON set to jupyter-lab, the pyspark launcher starts JupyterLab as the driver:

source venv/bin/activate
# pyspark was installed with pip, so SPARK_HOME lives inside site-packages
export SPARK_HOME={path_to_venv}/lib/python2.7/site-packages/pyspark
# make the pyspark launcher start JupyterLab as the driver
export PYSPARK_DRIVER_PYTHON=jupyter-lab
pyspark
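If you would rather start jupyter-lab directly instead of going through the pyspark launcher, here is a minimal sketch of creating the context yourself in a notebook cell (this alternative is not part of the original answer; the names follow the standard pyspark API, and the master and app name simply mirror the output shown further down):

from pyspark.sql import SparkSession

# Build (or reuse) a local session and take its SparkContext.
spark = SparkSession.builder.master("local[*]").appName("PySparkShell").getOrCreate()
sc = spark.sparkContext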

Before this, I hope you have already done pip install pyspark and pip install jupyterlab inside your virtualenv.
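If you are unsure what to substitute for {path_to_venv}, a quick convenience sketch (not part of the original answer) for locating the pip-installed pyspark directory that SPARK_HOME should point to:

import os
import pyspark

# The directory containing the pyspark package is the value for SPARK_HOME.
print(os.path.dirname(pyspark.__file__))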

To check, once JupyterLab is open, type sc in a notebook cell; the SparkContext object should already be available, and the output should look like this:

SparkContext
Spark UI
Version: v2.2.1
Master: local[*]
AppName: PySparkShell
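Beyond inspecting sc, a quick sanity check (assuming sc exists as shown above) is to run a trivial job through the context:

# Sum the integers 0..99 on the local cluster; 0 + 1 + ... + 99 = 4950.
print(sc.parallelize(range(100)).sum())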
