Error message when launching PySpark from Jupyter notebook on Windows


Problem Description

This same approach to running Apache Spark on Jupyter used to work, but now it throws: Exception: Java gateway process exited before sending the driver its port number

Here is the configuration in the Jupyter notebook which was working previously.

import os
import sys

spark_home = os.environ.get('SPARK_HOME', None)
print(spark_home)
spark_home = spark_home + "/python"
sys.path.insert(0, spark_home)
sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))

filename = os.path.join(spark_home, 'pyspark/shell.py')
print(filename)
exec(compile(open(filename, "rb").read(), filename, 'exec'))

spark_release_file = spark_home + "/RELEASE"

if os.path.exists(spark_release_file) and "Spark 1.5" in open(spark_release_file).read():
    pyspark_submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS", "")
    if not "pyspark-shell" in pyspark_submit_args:
        pyspark_submit_args += " pyspark-shell"
        os.environ["PYSPARK_SUBMIT_ARGS"] = pyspark_submit_args

The exec statement is throwing the exception.

Please let me know what I am doing wrong.

Recommended Answer

You need to invoke the exec statement inside your if statement:

    import os
    import sys

    spark_home = os.environ.get('SPARK_HOME', None)
    print(spark_home)
    spark_home = spark_home + "/python"
    sys.path.insert(0, spark_home)
    sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))

    filename = os.path.join(spark_home, 'pyspark/shell.py')
    print(filename)

    spark_release_file = spark_home + "/RELEASE"

    if os.path.exists(spark_release_file) and "Spark 1.5" in open(spark_release_file).read():
        argsstr = "--master yarn pyspark-shell"
        pyspark_submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS", argsstr)
        if "pyspark-shell" not in pyspark_submit_args:
            pyspark_submit_args += " pyspark-shell"
        # Export the args before shell.py runs, so the Java gateway is
        # launched with the pyspark-shell token it expects.
        os.environ["PYSPARK_SUBMIT_ARGS"] = pyspark_submit_args
        # The key fix: exec is now invoked inside the if statement,
        # after PYSPARK_SUBMIT_ARGS has been set.
        exec(compile(open(filename, "rb").read(), filename, 'exec'))
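The PYSPARK_SUBMIT_ARGS handling in the answer can be factored into a small standalone helper, which makes the "gateway exited before sending the driver its port number" cause easier to see: shell.py reads this environment variable when it builds the Java gateway, and the launch fails if the pyspark-shell token is missing. A minimal sketch (the helper name ensure_pyspark_shell is mine, not from the original answer):

```python
import os

def ensure_pyspark_shell(default_args="--master yarn pyspark-shell"):
    """Return PYSPARK_SUBMIT_ARGS, guaranteed to contain 'pyspark-shell'.

    shell.py consults this variable when launching the Java gateway;
    without the 'pyspark-shell' token the gateway process exits before
    sending the driver its port number -- the exception in the question.
    """
    args = os.environ.get("PYSPARK_SUBMIT_ARGS", default_args)
    if "pyspark-shell" not in args:
        args += " pyspark-shell"
    os.environ["PYSPARK_SUBMIT_ARGS"] = args  # export for shell.py
    return args
```

Calling ensure_pyspark_shell() once before the exec of shell.py replaces the inline if/append logic above and always exports the variable, whether it came from the environment or from the default.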

