Failed to start master for Spark in Windows 10


Problem description

I am new to Spark, and I am trying to start the master service manually (using MINGW64 on Windows 10). So when I do this:

    ~/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/sbin
    $ ./start-master.sh

I got these logs:

    ps: unknown option -- o
    Try `ps --help' for more information.
    starting org.apache.spark.deploy.master.Master, logging to /c/Users/Raunak/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/sbin/../logs/spark--org.apache.spark.deploy.master.Master-1-RINKU-CISPL.out
    ps: unknown option -- o
    Try `ps --help' for more information.
    failed to launch org.apache.spark.deploy.master.Master:
    Spark Command: C:\Program Files\Java\jre1.8.0_77\bin\java -cp C:/Users/Raunak/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/sbin/../conf\;C:/Users/Raunak/Downloads/spark-1.5.1-bin-hadoop2.4/spark-1.5.1-bin-hadoop2.4/lib/spark-assembly-1.5.1-hadoop2.4.0.jar;C:\Users\Raunak\Downloads\spark-1.5.1-bin-hadoop2.4\spark-1.5.1-bin-hadoop2.4\lib\datanucleus-api-jdo-3.2.6.jar;C:\Users\Raunak\Downloads\spark-1.5.1-bin-hadoop2.4\spark-1.5.1-bin-hadoop2.4\lib\datanucleus-core-3.2.10.jar;C:\Users\Raunak\Downloads\spark-1.5.1-bin-hadoop2.4\spark-1.5.1-bin-hadoop2.4\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip RINKU-CISPL --port 7077 --webui-port 8080

What am I doing wrong? Do I also have to configure a Hadoop package for Spark?

Recommended answer

Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html

"Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."

I think Windows is not a good choice for Spark. Anyway, good luck!

