"The system cannot find the path specified" error while running pyspark


Problem description


I just downloaded spark-2.3.0-bin-hadoop2.7.tgz. After downloading, I followed the steps mentioned here for pyspark installation on Windows 10. I used the command bin\pyspark to run Spark and got the error message

The system cannot find the path specified


Attached is a screenshot of the error message.


Attached is a screenshot of my Spark bin folder.


A screenshot of my path variable looks like this.


I have Python 3.6 and Java "1.8.0_151" on my Windows 10 system. Can you suggest how to resolve this issue?
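Before launching bin\pyspark, it can help to confirm what the relevant environment variables actually contain. A minimal diagnostic sketch (the variable names checked here are the usual ones for a Spark setup; adjust to your installation):

```python
import os

def report_env(names=("JAVA_HOME", "SPARK_HOME", "PATH")):
    """Collect the environment variables that launching pyspark relies on."""
    return {name: os.environ.get(name, "<not set>") for name in names}

# A misconfigured JAVA_HOME (for example, one ending in \bin) is easy to
# spot in this output before trying to launch Spark again.
for name, value in report_env().items():
    print(f"{name} = {value}")
```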

Recommended answer


Actually, the problem was with the JAVA_HOME environment variable path. The JAVA_HOME path had previously been set to .../jdk/bin.


Stripping the trailing /bin part from JAVA_HOME, while keeping it (/jdk/bin) in the system path variable (%path%), did the trick.
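The distinction is that JAVA_HOME should point at the JDK root directory, while only %PATH% should include the bin subfolder. A small sketch of that check (the path used below is hypothetical, based on the Java version mentioned in the question):

```python
def java_home_is_valid(java_home: str) -> bool:
    """Return True if JAVA_HOME points at the JDK root, not its bin folder."""
    # Normalize separators and trailing slashes so Windows-style paths work too.
    normalized = java_home.replace("\\", "/").rstrip("/")
    # JAVA_HOME must NOT end in /bin; the bin folder belongs on %PATH% instead.
    return not normalized.lower().endswith("/bin")

# The broken setting from the question versus the fix (illustrative paths):
print(java_home_is_valid(r"C:\Program Files\Java\jdk1.8.0_151\bin"))  # False
print(java_home_is_valid(r"C:\Program Files\Java\jdk1.8.0_151"))      # True
```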
