Cannot start Spark history server


Problem description

I am running Spark on a YARN cluster. I tried to start the history server with

./start-history-server.sh

but got the following error:

starting org.apache.spark.deploy.history.HistoryServer, logging to /home/abc/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-abc-org.apache.spark.deploy.history.HistoryServer-1-abc-Efg.out
failed to launch org.apache.spark.deploy.history.HistoryServer:
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:47)
... 6 more
full log in /home/abc/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-abc-org.apache.spark.deploy.history.HistoryServer-1-abc-Efg.out

I have set spark.eventLog.enabled = true, and pointed spark.history.fs.logDirectory and spark.eventLog.dir to the HDFS logging directory.
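
(For reference, those settings would typically look something like the following in spark/conf/spark-defaults.conf when logging to HDFS; the hdfs://localhost:9000/spark-events path below is only an illustration, not the value from this setup:)

 spark.eventLog.enabled           true
 spark.eventLog.dir               hdfs://localhost:9000/spark-events
 spark.history.fs.logDirectory    hdfs://localhost:9000/spark-events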

Why can't I get the history server to start?

Update 1:

Thank you stf for telling me to look at the log file; I didn't know it existed!

I realised my problem was in my setting in spark-env.sh:

 export SPARK_HISTORY_OPTS="-Dspark.eventLog.enabled=true -Dspark.eventLog.dir=hdfs:///localhost/eventLogging spark.history.fs.logDirectory=hdfs:///localhost/eventLogging"

The forward slashes get turned into dots:

 Error: Could not find or load main class spark.history.fs.logDirectory=hdfs:...localhost.eventLogging

Any idea how to prevent this from happening?

Update 2: Solved this problem thanks to stf's help. The correct setting in spark-env.sh is below; the difference is that each property is now prefixed with -D (the last one was missing it above), so the JVM no longer treats spark.history.fs.logDirectory=... as a main-class name:

 SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS -Dspark.eventLog.enabled=true -Dspark.eventLog.dir=hdfs://localhost/eventLogging -Dspark.history.fs.logDirectory=hdfs://localhost/eventLogging"
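
(A side note: if the HDFS event-log directory does not exist yet, FsHistoryProvider can still fail at startup, so it may also be worth creating the directory and then restarting the server. The path below simply mirrors the setting above and is only illustrative:)

 # create the event-log directory in HDFS (illustrative path)
 hdfs dfs -mkdir -p hdfs://localhost/eventLogging
 # restart the history server so the new spark-env.sh is picked up
 ./sbin/stop-history-server.sh
 ./sbin/start-history-server.sh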

Recommended answer

For those still getting this error and not helped by the comment discussion: the following resolved this issue for me. Make sure that you have the following in spark/conf/spark-defaults.conf:

spark.eventLog.enabled          true
spark.eventLog.dir              /path/to/spark/logs
spark.history.fs.logDirectory   /path/to/spark/logs

Then run spark/sbin/start-history-server.sh /path/to/spark/logs
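
Once it starts, one quick way to check that the server is actually up (assuming the default web UI port 18080) is:

 # the history server web UI listens on port 18080 by default
 curl http://localhost:18080
 # or follow the log file named in the startup message
 tail -f spark/logs/spark-*-org.apache.spark.deploy.history.HistoryServer-*.out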
