PySpark - The system cannot find the path specified


Problem description

I have run Spark multiple times (Spyder IDE). Today I got this error (the code is the same):

import os
import sys

from py4j.java_gateway import JavaGateway
gateway = JavaGateway()

os.environ['SPARK_HOME'] = "C:/Apache/spark-1.6.0"
os.environ['JAVA_HOME'] = "C:/Program Files/Java/jre1.8.0_71"
sys.path.append("C:/Apache/spark-1.6.0/python/")
os.environ['HADOOP_HOME'] = "C:/Apache/spark-1.6.0/winutils/"

from pyspark import SparkContext
from pyspark import SparkConf

conf = SparkConf()
    The system cannot find the path specified.
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "C:\Apache\spark-1.6.0\python\pyspark\conf.py", line 104, in __init__
        SparkContext._ensure_initialized()
      File "C:\Apache\spark-1.6.0\python\pyspark\context.py", line 245, in _ensure_initialized
        SparkContext._gateway = gateway or launch_gateway()
      File "C:\Apache\spark-1.6.0\python\pyspark\java_gateway.py", line 94, in launch_gateway
        raise Exception("Java gateway process exited before sending the driver its port number")
    Exception: Java gateway process exited before sending the driver its port number

What went wrong? Thanks for your time.

Answer

OK... someone installed a new Java version in the VirtualMachine. I only changed this:

os.environ['JAVA_HOME']="C:/Program Files/Java/jre1.8.0_91" 

and it works again. Thanks for your time.
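Since the root cause was `JAVA_HOME` pointing at a Java directory that no longer existed, a small sanity check before touching `os.environ` can surface this kind of breakage immediately instead of as an opaque "Java gateway process exited" error. Below is a minimal sketch; `valid_java_home` is a hypothetical helper name, not part of PySpark.

```python
import os

def valid_java_home(path):
    """Return True only if `path` looks like a usable Java installation:
    the directory exists and contains bin/java (or bin/java.exe on Windows)."""
    if not os.path.isdir(path):
        return False
    return any(
        os.path.isfile(os.path.join(path, "bin", exe))
        for exe in ("java", "java.exe")
    )
```

Calling this on the candidate path (e.g. `valid_java_home("C:/Program Files/Java/jre1.8.0_91")`) before assigning `os.environ['JAVA_HOME']` lets you fail fast with a clear message when a Java upgrade has removed the old version directory.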
