Why does start-all.sh from root cause "failed to launch org.apache.spark.deploy.master.Master: JAVA_HOME is not set"?


Problem Description

I am trying to execute a Spark application built through the Scala IDE on my standalone Spark service, which runs on the Cloudera QuickStart VM 5.3.0.

The JAVA_HOME for my cloudera account is /usr/java/default.
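
As a quick check that this path points at a working JDK (assuming the usual QuickStart VM layout, where /usr/java/default is a symlink to the installed JDK):

[cloudera@localhost ~]$ echo $JAVA_HOME                      # should print /usr/java/default
[cloudera@localhost ~]$ ls -l /usr/java/default              # typically a symlink to the installed JDK
[cloudera@localhost ~]$ /usr/java/default/bin/java -version  # confirms the JDK actually runs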

But when I run it as the cloudera user, I get the following:

[cloudera@localhost sbin]$ pwd
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin
[cloudera@localhost sbin]$ ./start-all.sh
chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs': Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out: Permission denied
failed to launch org.apache.spark.deploy.master.Master:
tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out' for reading: No such file or directory
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
cloudera@localhost's password: 
localhost: chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs': Operation not permitted
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out: Permission denied
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out' for reading: No such file or directory
localhost: full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out

I had added export CMF_AGENT_JAVA_HOME=/usr/java/default in /etc/default/cloudera-scm-agent and ran sudo service cloudera-scm-agent restart. See How to set CMF_AGENT_JAVA_HOME.
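
In command form, that change amounts to roughly the following (a sketch; appending the line with tee -a is just one way to edit the file):

# add the agent's JAVA_HOME hint and restart the Cloudera Manager agent
echo 'export CMF_AGENT_JAVA_HOME=/usr/java/default' | sudo tee -a /etc/default/cloudera-scm-agent
sudo service cloudera-scm-agent restart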

I had also added export JAVA_HOME=/usr/java/default in the locate_java_home function definition in the file /usr/share/cmf/bin/cmf-server, and then restarted the cluster and the standalone Spark service.
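
The edit looked roughly like this; the body of locate_java_home shown here is only an illustrative sketch, not the actual contents of the Cloudera Manager script:

# /usr/share/cmf/bin/cmf-server -- illustrative sketch only
locate_java_home() {
  # line added so the script uses this JDK explicitly
  export JAVA_HOME=/usr/java/default
  # ... original JDK-detection logic of the script continues here ...
}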

But the error below keeps recurring when I start the Spark service as the root user:

[root@localhost spark]# sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
failed to launch org.apache.spark.deploy.master.Master:
  JAVA_HOME is not set
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
root@localhost's password: 
localhost: Connection closed by UNKNOWN

Can anybody suggest how to set JAVA_HOME so that the Spark standalone service can be started on Cloudera Manager?

Recommended Answer

The solution turned out to be quite easy and straightforward. I just added export JAVA_HOME=/usr/java/default to /root/.bashrc, and the Spark services then started successfully from the root user without the "JAVA_HOME is not set" error. Hope it helps somebody facing the same problem.
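
In command form, the fix amounts to something like this (appending to /root/.bashrc this way is just one option; you can also edit the file directly):

# as root: make JAVA_HOME visible to root's shell, then start the standalone daemons
echo 'export JAVA_HOME=/usr/java/default' >> /root/.bashrc
source /root/.bashrc
cd /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark
sbin/start-all.sh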

