The SPARK_HOME env variable is set but Jupyter Notebook doesn't see it. (Windows)
Problem description
I'm on Windows 10. I'm trying to get Spark up and running in a Jupyter Notebook alongside Python 3.5. I installed a pre-built version of Spark, set the SPARK_HOME environment variable, installed findspark, and ran the code:
import findspark
findspark.init()
I get a ValueError:

ValueError: Couldn't find Spark, make sure SPARK_HOME env is set or Spark is in an expected location (e.g. from homebrew installation).
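One common cause on Windows is that the Jupyter kernel process was started before the environment variable was defined (or from a launcher that doesn't inherit it), so the variable exists in the system settings but not in the notebook's own process. A minimal sketch of a workaround is to set it from inside the notebook before calling findspark; the Spark path below is a hypothetical example and must be adjusted to wherever you actually unpacked Spark:

```python
import os

# Hypothetical install location -- adjust to your actual Spark directory.
spark_home = r"C:\spark\spark-2.1.0-bin-hadoop2.7"

# Setting the variable inside the notebook's own process guarantees
# findspark sees it, regardless of how Windows env vars were configured.
os.environ["SPARK_HOME"] = spark_home

# Then run as before:
#   import findspark
#   findspark.init()
# findspark.init() also accepts the path directly: findspark.init(spark_home)
```

If this works, the original problem was only that the kernel never inherited the variable, not that SPARK_HOME pointed at the wrong folder.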
However, the SPARK_HOME variable is set. Here is a screenshot showing the list of environment variables on my system.
Has anyone encountered this issue or does anyone know how to fix it? I only found an old discussion in which someone had set SPARK_HOME to the wrong folder, but I don't think that's my case.
Recommended answer
I had the same problem and solved it by installing Vagrant and VirtualBox. (Note, though, that I use Mac OS and Python 2.7.11.)
Take a look at this tutorial, which is for the Harvard CS109 course: https://github.com/cs109/2015lab8/blob/master/installing_vagrant.pdf
After running "vagrant reload" in the terminal, I was able to run my code without errors. Note the difference between the results of the command os.getcwd shown in the attached images.