PySpark 2.4.5 is not compatible with Python 3.8.3, how do I solve this?


Problem description

Code

from pyspark import SparkContext, SparkConf

conf = SparkConf().setMaster('local').setAppName('Test App')
sc = SparkContext(conf=conf)

Error message

    Traceback (most recent call last):
      File "C:\Users\Test\PycharmProjects\python-test\MainFile.py", line 5, in <module>
        from pyspark import SparkContext,SparkConf
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\__init__.py", line 51, in <module>
        from pyspark.context import SparkContext
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\context.py", line 31, in <module>
        from pyspark import accumulators
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\accumulators.py", line 97, in <module>
        from pyspark.serializers import read_int, PickleSerializer
      File "C:\Test\Python_3.8.3_Latest\lib\sit`enter code here`e-packages\pyspark\serializers.py", line 72, in <module>
        from pyspark import cloudpickle
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\cloudpickle.py", line 145, in <module>
        _cell_set_template_code = _make_cell_set_template_code()
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\cloudpickle.py", line 126, in _make_cell_set_template_code
        return types.CodeType(
    TypeError: an integer is required (got type bytes)

Recommended answer

Although the latest Spark documentation says it supports Python 2.7+/3.4+, it does not actually support Python 3.8 yet: Python 3.8 changed the signature of the types.CodeType constructor, which breaks the old cloudpickle copy bundled with Spark 2.4 and produces the TypeError above. According to this PR, Python 3.8 support is expected in Spark 3.0. So you can either try out the Spark 3.0 preview release (assuming you are not doing a production deployment) or "temporarily" fall back to Python 3.6/3.7 for Spark 2.4.x.
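
If you stay on PySpark 2.4.x for now, one option is to fail fast with a clear message instead of the cryptic TypeError. Below is a minimal sketch (not an official PySpark check) that guards the import on the assumption that only Python 3.6/3.7 work with Spark 2.4.x, then builds the context as in the question:

import sys

# PySpark 2.4.x bundles an old cloudpickle that calls types.CodeType with the
# pre-3.8 argument list, so it breaks on Python 3.8+; stop early with a clear error.
if sys.version_info >= (3, 8):
    raise RuntimeError(
        "PySpark 2.4.x does not support Python %d.%d; "
        "use Python 3.6/3.7 or move to Spark 3.0." % sys.version_info[:2]
    )

from pyspark import SparkContext, SparkConf

conf = SparkConf().setMaster('local').setAppName('Test App')
sc = SparkContext(conf=conf)

The alternative is simply to create a separate Python 3.6/3.7 environment for the Spark job, or to install the Spark 3.0 preview if Python 3.8 is a hard requirement.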

