Spark Python error "FileNotFoundError: [WinError 2] The system cannot find the file specified"
Problem description
I am new to Spark and Python. I have installed Python 3.5.1 and spark-1.6.0-bin-hadoop2.4 on Windows.
I get the following error when I execute sc = SparkContext("local", "Simple App") from the Python shell.
Can you please help?
from pyspark import SparkConf, SparkContext
sc = SparkContext("local", "Simple App")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    sc = SparkContext("local", "Simple App")
  File "C:\spark-1.6.0-bin-hadoop2.4\python\pyspark\context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark-1.6.0-bin-hadoop2.4\python\pyspark\context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\spark-1.6.0-bin-hadoop2.4\python\pyspark\java_gateway.py", line 79, in launch_gateway
    proc = Popen(command, stdin=PIPE, env=env)
  File "C:\Python35-32\lib\subprocess.py", line 950, in __init__
    restore_signals, start_new_session)
  File "C:\Python35-32\lib\subprocess.py", line 1220, in _execute_child
    startupinfo)
FileNotFoundError: [WinError 2] The system cannot find the file specified
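The last frame before the error is a plain subprocess.Popen call: launch_gateway tries to spawn the spark-submit launcher, and Windows raises WinError 2 when that executable cannot be found. A minimal sketch of the same failure, independent of Spark (the executable name below is made up and deliberately missing):

```python
import subprocess

# Popen raises FileNotFoundError when the program it is asked to launch
# does not exist -- the same error launch_gateway hits when the Spark
# path does not point at a real Spark installation.
try:
    subprocess.Popen(["no-such-spark-submit.cmd"])  # hypothetical name, does not exist
except FileNotFoundError as exc:
    print("launch failed:", exc)
```

So the error is not about your Python script itself: the file Windows cannot find is the Spark launcher that Popen is trying to start.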
Check your path to make sure it is written correctly. In my case, I had the path set as:
"C:/Users/nekooeimehr/AppData/Local/Programs/Python/Python35-32/spark-1.6.2-bin-hadoop2.4"
while the correct path is:
"C:/Users/nekooeimehr/AppData/Local/Programs/Python/Python35-32/spark-1.6.2-bin-hadoop2.4/spark-1.6.2-bin-hadoop2.4"
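The nested directory happens when the Spark archive is extracted into a folder of the same name, so the outer folder has no bin\spark-submit.cmd inside it. One way to catch this early is to sanity-check the path before creating the context; a small sketch (the helper name and the Windows launcher filename are assumptions, not part of the original answer):

```python
import os

def looks_like_spark_home(spark_home):
    """Heuristic check that spark_home is the root of a Spark distribution:
    a directory that contains bin/spark-submit.cmd, the Windows launcher
    that launch_gateway ultimately runs via Popen."""
    launcher = os.path.join(spark_home, "bin", "spark-submit.cmd")
    return os.path.isdir(spark_home) and os.path.isfile(launcher)

# With the nested layout from the answer, only the inner
# .../spark-1.6.2-bin-hadoop2.4/spark-1.6.2-bin-hadoop2.4 directory
# would pass this check; the outer folder contains no bin/ directory.
```

If the check fails for the path you configured, point it one level deeper (or move the extracted files up one level) before calling SparkContext.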