How to start a Spark Shell using pyspark in Windows?


Problem description

I am a beginner in Spark and am trying to follow the instructions here on how to initialize the Spark shell from Python using cmd: http://spark.apache.org/docs/latest/quick-start.html

But when I run the following in cmd:

C:\Users\Alex\Desktop\spark-1.4.1-bin-hadoop2.4\>c:\Python27\python bin\pyspark 

then I receive the following error message:

File "bin\pyspark", line 21 
export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
SyntaxError: invalid syntax

What am I doing wrong here?

P.S. When in cmd I try just C:\Users\Alex\Desktop\spark-1.4.1-bin-hadoop2.4>bin\pyspark

then I receive "python" is not recognized as an internal or external command, operable program or batch file.

Recommended answer

You need to have Python available on the system path; you can add it with setx:

setx path "%path%;C:\Python27"
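Note that `setx` writes the updated PATH to the registry, so it only takes effect in newly opened cmd windows, not the current one. As a rough sketch of the same idea (a hypothetical illustration, not part of the original answer), the snippet below appends a directory to PATH for the current process and its child processes using Python's `os.environ`:

```python
import os

# Hypothetical install directory from the question
python_dir = r"C:\Python27"

# Append the directory to PATH for this process and any child
# processes it spawns -- analogous to what
# `setx path "%path%;C:\Python27"` does persistently for future
# cmd windows
if python_dir not in os.environ.get("PATH", ""):
    os.environ["PATH"] = os.environ.get("PATH", "") + os.pathsep + python_dir

print(python_dir in os.environ["PATH"])  # → True
```

After adding the directory with `setx`, open a new cmd window so the change is picked up, then `bin\pyspark` can find the `python` executable.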

