Pyspark launch issue on Windows 10 with Python 3.6


Problem description

I am unable to launch PySpark on Windows after installing Python 3.x with Anaconda. I get the error below:

Python 3.6.0 |Anaconda 4.3.0 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "C:\Users\prudra\Desktop\Udemy\spark-2.1.0-bin-hadoop2.7\bin\..\python\pyspark\shell.py", line 30, in <module>
    import pyspark
  File "C:\Users\prudra\Desktop\Udemy\spark-2.1.0-bin-hadoop2.7\python\pyspark\__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "C:\Users\prudra\Desktop\Udemy\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 36, in <module>
    from pyspark.java_gateway import launch_gateway
  File "C:\Users\prudra\Desktop\Udemy\spark-2.1.0-bin-hadoop2.7\python\pyspark\java_gateway.py", line 31, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "C:\Users\prudra\Desktop\Udemy\spark-2.1.0-bin-hadoop2.7\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py", line 18, in <module>
  File "C:\Users\prudra\AppData\Local\Continuum\Anaconda3\lib\pydoc.py", line 62, in <module>
    import pkgutil
  File "C:\Users\prudra\AppData\Local\Continuum\Anaconda3\lib\pkgutil.py", line 22, in <module>
    ModuleInfo = namedtuple('ModuleInfo', 'module_finder name ispkg')
  File "C:\Users\prudra\Desktop\Udemy\spark-2.1.0-bin-hadoop2.7\python\pyspark\serializers.py", line 393, in namedtuple
    cls = _old_namedtuple(*args, **kwargs)
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'

Please let me know how to resolve this.

Recommended answer

Spark 2.1.1 was just released on May 4th. It now works with Python 3.6; you can see the release notes here.
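For context, the TypeError in the traceback comes from the fact that Python 3.6 made namedtuple's verbose, rename, and module parameters keyword-only, which the namedtuple wrapper bundled with Spark 2.1.0 does not handle; Spark 2.1.1 fixed that. After installing the newer Spark, a quick smoke test like the following can confirm the versions line up. This is just an illustrative sketch, assuming SPARK_HOME points at the new distribution and pyspark is importable from your Python 3.6 interpreter:

```python
import sys

# Python 3.6+ is where the keyword-only namedtuple arguments appeared,
# which is exactly what broke PySpark 2.1.0 in the traceback above.
assert sys.version_info >= (3, 6)

from pyspark import SparkContext

# Local mode is enough to verify that the shell/gateway starts cleanly.
sc = SparkContext("local[*]", "version-check")
print("Spark version:", sc.version)              # should report 2.1.1 or later
print("Python version:", sys.version.split()[0])

# A trivial job to confirm the Python worker side also runs.
print(sc.parallelize(range(10)).sum())           # expected output: 45
sc.stop()
```

If the assertions and the small job run without the namedtuple TypeError, the upgrade resolved the incompatibility.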
