What path do I use for pyspark?
Question
I have Spark installed, and I can go into the bin folder within my Spark version and run ./spark-shell, which works correctly.
But, for some reason, I am unable to launch pyspark or any of the submodules.
So, I go into bin and launch ./pyspark, and it tells me that my path is incorrect.
The current path I have for PYSPARK_PYTHON is the same directory I'm running the pyspark executable script from.
What is the correct path for PYSPARK_PYTHON? Shouldn't it be the path to the executable script called pyspark in the bin folder of the Spark version?
That's the path I have now, but it tells me env: <full PYSPARK_PYTHON path>: No such file or directory. Thanks.
Answer
What is the correct path for PYSPARK_PYTHON? Shouldn't it be the path to the executable script called pyspark in the bin folder of the Spark version?
No, it shouldn't. It should point to the Python executable you want to use with Spark (for example, the output of which python). If you don't want to use a custom interpreter, just leave it unset; Spark will use the first Python interpreter available on your system PATH.
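As a minimal sketch of the fix described above (the interpreter path is an example; use whatever which python or command -v python3 prints on your machine):

```shell
# PYSPARK_PYTHON must name a Python interpreter, not the bin/pyspark
# launcher script itself.
export PYSPARK_PYTHON="$(command -v python3)"

# Sanity check: the variable must point at an executable file, otherwise
# the launcher fails with "env: ...: No such file or directory".
test -x "$PYSPARK_PYTHON" && echo "PYSPARK_PYTHON ok: $PYSPARK_PYTHON"

# Then launch pyspark from the Spark directory as before:
# ./bin/pyspark
```

To fall back to the default behavior (first python on PATH), simply run unset PYSPARK_PYTHON before launching.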