Failed to start master for Spark in Windows


Problem description

I have the same problem as "Failed to start master for Spark in Windows 10", which has not been solved either.

My Spark works fine when tested with pyspark.cmd and spark-shell.cmd.

After running .\sbin\start-master.sh I got:

ps: unknown option -- o
Try 'ps --help' for more information.
starting org.apache.spark.deploy.master.Master, logging to C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out
ps: unknown option -- o
Try 'ps --help' for more information.
failed to launch org.apache.spark.deploy.master.Master:
  ========================================
  Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M
full log in C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out

I tried to visit the web UI; while localhost:4040 works, localhost:8080 cannot be reached.

I also found .log files created in the %SPARK_HOME%/logs folder. They all contain the same content:

Spark Command:

C:\Program Files\Java\jdk1.7.0_79\bin\java -cp C:\spark-1.6.1-bin-hadoop2.6/conf;C:\spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-api-jdo-3.2.6.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-core-3.2.10.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip hahaha-PC --port 7077 --webui-port 8080

========================================
Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M

Working environment: Spark 1.6.1, Windows 10

Looking forward to your reply, and thanks so much for your time!

Recommended answer

Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html

"Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."
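In practice, "by hand" means invoking the master and worker classes through bin\spark-class, which does ship as a .cmd script on Windows. A minimal sketch from %SPARK_HOME% (the hostname hahaha-PC is taken from the question's log; substitute your own):

```shell
REM Console 1: start the master; its web UI should come up on port 8080.
bin\spark-class org.apache.spark.deploy.master.Master --host hahaha-PC --port 7077 --webui-port 8080

REM Console 2: start a worker and register it with the master's spark:// URL.
bin\spark-class org.apache.spark.deploy.worker.Worker spark://hahaha-PC:7077
```

Each command blocks its console, so the two processes run in separate windows; the worker should then appear on the master's web UI at localhost:8080.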

