Spark Standalone Mode: Worker not starting properly in cloudera


Problem description

I am new to Spark; I installed it using the parcels available in Cloudera Manager.

I have configured the files as shown in the link below from Cloudera Enterprise:

After this setup, I started all the nodes by running /opt/cloudera/parcels/SPARK/lib/spark/sbin/start-all.sh, but the worker failed to launch with the error shown below.

[root@localhost sbin]# sh start-all.sh
org.apache.spark.deploy.master.Master running as process 32405. Stop it first.
root@localhost.localdomain's password: 
localhost.localdomain: starting org.apache.spark.deploy.worker.Worker, logging to /var/log/spark/spark-root-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost.localdomain: failed to launch org.apache.spark.deploy.worker.Worker:
localhost.localdomain:      at java.lang.ClassLoader.loadClass(libgcj.so.10)
localhost.localdomain:      at gnu.java.lang.MainThread.run(libgcj.so.10)
localhost.localdomain: full log in /var/log/spark/spark-root-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost.localdomain:starting org.apac

When I run the jps command, I get:

23367 Jps
28053 QuorumPeerMain
28218 SecondaryNameNode
32405 Master
28148 DataNode
7852 Main
28159 NameNode

I couldn't run the worker node properly. My intention was a standalone Spark setup where the master and worker run on a single machine. In the slaves file of the Spark conf directory, I gave the address as "localhost.localdomin", which is my hostname. I am not familiar with this settings file. Could anyone please help me with this installation? I can start the master node, but I cannot get the worker nodes to run.
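For reference, on a single-machine standalone cluster the slaves file usually just lists the local hostname, one per line. A sketch, assuming the parcel layout from the start-all.sh path above:

```shell
# conf/slaves tells start-all.sh which hosts to SSH into and launch
# workers on; for a one-machine setup it is just the local hostname.
# The path below assumes the Cloudera parcel install from the question.
cat /opt/cloudera/parcels/SPARK/lib/spark/conf/slaves
# expected contents, matching the output of `hostname` exactly:
# localhost.localdomain
```

Note that the question quotes the entry as "localhost.localdomin" (missing an "a"); if the file really contains that misspelling, it is worth double-checking it against the output of `hostname`.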

Thanks & Regards, Bips

Recommended answer

Note the following error message:

localhost.localdomain:      at java.lang.ClassLoader.loadClass(libgcj.so.10)


I met the same error when I installed and started Spark master/workers on CentOS 6.2 x86_64, after making sure that libgcj.x86_64 and libgcj.i686 had been installed on my server. I finally solved it; below is my solution, and I hope it helps you.

It seems your JAVA_HOME environment variable is not set correctly.
Your JAVA_HOME may be pointing at the system's embedded Java, e.g. java version "1.5.0".
Spark needs Java version >= 1.6.0. If you start Spark with Java 1.5.0, you will see this error message.
Export JAVA_HOME="your java home path", then start Spark again.
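A quick way to confirm the culprit is to check the JVM version string before restarting. The helper below is only a sketch (the function name and sample version strings are mine, not from the original answer); it applies the answer's >= 1.6.0 rule to a version string like the one printed by `java -version`:

```shell
# Spark here requires Java >= 1.6.0. This helper checks a version
# string (e.g. from `java -version`) against that requirement.
java_at_least_16() {
  ver=$1                   # e.g. "1.5.0" or "1.7.0_67"
  major=${ver%%.*}         # text before the first dot
  rest=${ver#*.}
  minor=${rest%%.*}        # text between the first and second dot
  [ "$major" -gt 1 ] || { [ "$major" -eq 1 ] && [ "$minor" -ge 6 ]; }
}

# The system-embedded GCJ java is typically 1.5.0, which is too old:
java_at_least_16 "1.5.0"    && echo "1.5.0: ok" || echo "1.5.0: too old"
# A Sun/Oracle JDK 1.7 passes:
java_at_least_16 "1.7.0_67" && echo "1.7.0_67: ok" || echo "1.7.0_67: too old"
```

Once a suitable JDK is confirmed, point JAVA_HOME at its install directory (for example `export JAVA_HOME=/usr/java/jdk1.7.0_67`, a hypothetical path; use your own), prepend `$JAVA_HOME/bin` to PATH, and rerun sbin/start-all.sh so the workers pick up the new JVM.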

