Failed to start master for Spark in Windows


Problem description

> Same problem as "Failed to start master for Spark in Windows 10", which is also unsolved.

Spark itself works fine: both pyspark.cmd and spark-shell.cmd run correctly.

After running .\sbin\start-master.sh, I got:

ps: unknown option -- o
Try 'ps --help' for more information.
starting org.apache.spark.deploy.master.Master, logging to C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out
ps: unknown option -- o
Try 'ps --help' for more information.
failed to launch org.apache.spark.deploy.master.Master:
  ========================================
  Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M
full log in C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out

I tried to visit the web UI: localhost:4040 is working, but localhost:8080 cannot be reached.

I also found .log files created in the %SPARK_HOME%/logs folder. They all contain the same content:

Spark Command:

C:\Program Files\Java\jdk1.7.0_79\bin\java -cp C:\spark-1.6.1-bin-hadoop2.6/conf\;C:\spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-api-jdo-3.2.6.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-core-3.2.10.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip hahaha-PC --port 7077 --webui-port 8080

========================================
Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M

Working environment: Spark 1.6.1, Windows 10

Looking forward to your reply, and thanks for your time!

Answer

Found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html

"Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."
