Unable to open PySpark in macOS


Problem description

I have installed PySpark through pip but am unable to open it. It shows the following error:

/Users/sonveer.narwaria/anaconda/bin/pyspark: line 24: /Users/sonveer.narwaria/anaconda/lib/python3.6/site-packages/pyspark/bin/load-spark-env.sh: No such file or directory
/Users/sonveer.narwaria/anaconda/bin/pyspark: line 77: /Users/sonveer.narwaria//Users/sonveer.narwaria/anaconda/lib/python3.6/site-packages/pyspark/bin/spark-submit: No such file or directory
/Users/sonveer.narwaria/anaconda/bin/pyspark: line 77: exec: /Users/sonveer.narwaria//Users/sonveer.narwaria/anaconda/lib/python3.6/site-packages/pyspark/bin/spark-submit: cannot execute: No such file or directory

Recommended answer

You should download a full Spark distribution, as described here. PyPI installations of PySpark (i.e. through pip, as you did) are suitable only for connecting to an already existing Spark cluster; from the docs:

The Python packaging for Spark is not intended to replace all of the other use cases. This Python packaged version of Spark is suitable for interacting with an existing cluster (be it Spark standalone, YARN, or Mesos), but does not contain the tools required to set up your own standalone Spark cluster. You can download the full version of Spark from the Apache Spark downloads page.
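As a concrete sketch, installing a full distribution on macOS might look like the following. The version number, archive URL pattern, and install location are illustrative assumptions; pick the current release from the Apache Spark downloads page.

```shell
# Sketch: install a full Spark distribution so that `pyspark` launches a
# local Spark rather than the pip-installed connector package.
# SPARK_VERSION and the paths below are illustrative assumptions.
SPARK_VERSION="3.5.1"
SPARK_DIR="$HOME/spark-${SPARK_VERSION}-bin-hadoop3"

# Download and unpack (run manually; the archive URL pattern may change):
#   curl -O "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3.tgz"
#   tar -xzf "spark-${SPARK_VERSION}-bin-hadoop3.tgz" -C "$HOME"

# Point the shell at the unpacked distribution so its bin/ wins on PATH:
export SPARK_HOME="$SPARK_DIR"
export PATH="$SPARK_HOME/bin:$PATH"
```

To make this persistent, the two `export` lines would go in your shell profile (e.g. `~/.bash_profile` or `~/.zshrc`).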

NOTE: If you are using this with a Spark standalone cluster, you must ensure that the version (including minor version) matches, or you may experience odd errors.
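One way to apply that note is to compare the major.minor components of the local PySpark version and the cluster's Spark version. The version strings below are illustrative assumptions; in practice you might obtain them from `pyspark --version` and your cluster's UI.

```shell
# Sketch: check that major.minor of the local PySpark matches the cluster.
# Both version strings here are assumed for illustration.
local_ver="3.5.1"     # e.g. the pip-installed pyspark version
cluster_ver="3.5.0"   # e.g. the standalone cluster's Spark version

local_mm="${local_ver%.*}"       # strip the patch component, e.g. "3.5"
cluster_mm="${cluster_ver%.*}"

if [ "$local_mm" = "$cluster_mm" ]; then
    echo "versions compatible: $local_mm"
else
    echo "version mismatch: $local_mm vs $cluster_mm" >&2
fi
```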

