ipython is not recognized as an internal or external command (pyspark)


Problem Description

I have installed the Spark release spark-2.2.0-bin-hadoop2.7.

I am using the Windows 10 operating system.

My Java version is 1.8.0_144.

I have set my environment variables:

SPARK_HOME D:\spark-2.2.0-bin-hadoop2.7

HADOOP_HOME D:\Hadoop ( where I put bin\winutils.exe )

PYSPARK_DRIVER_PYTHON ipython

PYSPARK_DRIVER_PYTHON_OPTS notebook

Path D:\spark-2.2.0-bin-hadoop2.7\bin
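As a minimal sketch, these values can be persisted from a Windows command prompt with setx (this just replays the variables listed above; setx writes user-level values, a new console is needed before they take effect, and appending to PATH this way merges the user and system PATH, so adjust as needed):

:: Persist the variables from the question for the current user.
setx SPARK_HOME "D:\spark-2.2.0-bin-hadoop2.7"
setx HADOOP_HOME "D:\Hadoop"
setx PYSPARK_DRIVER_PYTHON "ipython"
setx PYSPARK_DRIVER_PYTHON_OPTS "notebook"
:: Append the Spark bin folder to PATH (%PATH% expands to the combined user+system PATH here).
setx PATH "%PATH%;D:\spark-2.2.0-bin-hadoop2.7\bin"
:: Open a new command prompt afterwards so the updated variables are picked up.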

When I launch pyspark from the command line, I get this error:

ipython is not recognized as an internal or external command

I also tried setting PYSPARK_DRIVER_PYTHON to jupyter, but it gives me the same error (not recognized as an internal or external command).

Any help, please?

Recommended Answer

Search your machine for the ipython application; in my case it is in "c:\Anaconda3\Scripts". Then just add that path to the PATH environment variable.
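As a hedged sketch (c:\Anaconda3\Scripts is only the answerer's install location; yours will differ), the ipython executable can be located and the fix verified from a command prompt like this:

:: Search the drive for ipython.exe to find which folder needs to go on PATH (can take a while).
where /R c:\ ipython.exe
:: After adding that folder to PATH and opening a new prompt, both of these should resolve:
where ipython
pyspark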
