Systemd Hdfs Service [hadoop] - startup


Problem description


I have created a service that starts and stops the HDFS instance associated with my Spark cluster.
The service:

[Unit]
Description=Hdfs service
[Service]
Type=simple
WorkingDirectory=/home/hduser
ExecStart=/opt/hadoop-2.6.4/sbin/start-service-hdfs.sh
ExecStop=/opt/hadoop-2.6.4/sbin/stop-service-hdfs.sh
[Install]
WantedBy=multi-user.target


The problem is that when I start the service, it stops immediately after starting! I think the problem is the service type, but I don't really know which type to choose...
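A likely explanation, assuming start-service-hdfs.sh behaves like Hadoop's stock scripts and daemonizes: with `Type=simple`, systemd treats the `ExecStart` process itself as the service's main process, so when the launcher script exits after forking the HDFS daemons, systemd concludes the whole service has stopped. A sketch of the relevant settings (the PID file path is an assumption here, following Hadoop's usual `/tmp/hadoop-<user>-<daemon>.pid` naming):

```ini
[Service]
# Type=simple would track the launcher script itself; it exits right after
# forking the HDFS daemons, so systemd would stop the unit immediately.
Type=forking
# Tell systemd which forked process to track as the service's main process.
# Assumed path: Hadoop daemons write /tmp/hadoop-<user>-<daemon>.pid by default.
PIDFile=/tmp/hadoop-hduser-namenode.pid
```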


Thank you.
Best regards

Recommended answer


There are some issues in your config; that is why it is not working.

I am running Hadoop as the hadoop user, and HADOOP_HOME is /home/hadoop/envs/dwh/hadoop/.

[Unit]
Description=Hadoop DFS namenode and datanode
After=syslog.target network.target remote-fs.target nss-lookup.target network-online.target
Requires=network-online.target

[Service]
User=hadoop
Group=hadoop
Type=forking
ExecStart=/home/hadoop/envs/dwh/hadoop/sbin/start-dfs.sh
ExecStop=/home/hadoop/envs/dwh/hadoop/sbin/stop-dfs.sh
WorkingDirectory=/home/hadoop/envs/dwh
Environment=JAVA_HOME=/usr/lib/jvm/java-8-oracle
Environment=HADOOP_HOME=/home/hadoop/envs/dwh/hadoop
TimeoutStartSec=2min
Restart=on-failure
PIDFile=/tmp/hadoop-hadoop-namenode.pid

[Install]
WantedBy=multi-user.target
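To try the corrected unit out, it can be installed and enabled as below. This is a hedged sketch: the unit name hdfs.service is an assumption, the heredoc writes a trimmed copy of the unit to the current directory for inspection, and the actual systemctl steps (which need root on a systemd host) are left as comments.

```shell
# Write a trimmed copy of the unit above (assumed file name: hdfs.service).
cat > hdfs.service <<'EOF'
[Unit]
Description=Hadoop DFS namenode and datanode
After=network-online.target
Requires=network-online.target

[Service]
User=hadoop
Group=hadoop
Type=forking
ExecStart=/home/hadoop/envs/dwh/hadoop/sbin/start-dfs.sh
ExecStop=/home/hadoop/envs/dwh/hadoop/sbin/stop-dfs.sh
PIDFile=/tmp/hadoop-hadoop-namenode.pid

[Install]
WantedBy=multi-user.target
EOF

# On the target host (needs root):
#   sudo cp hdfs.service /etc/systemd/system/
#   sudo systemctl daemon-reload
#   sudo systemctl enable --now hdfs.service
#   journalctl -u hdfs.service -e     # inspect logs if startup fails
grep -q '^Type=forking' hdfs.service && echo "unit looks sane"
```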

Checklist:

  • User and group are set
  • The service type is forking
  • The PID file is set; it is the actual PID file created by start-dfs.sh
  • The environment variables are correct
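The PID file matters because, for a `Type=forking` service, systemd decides whether the unit is alive by checking the process named in `PIDFile=`. A minimal sketch of that liveness check; the temp file stands in for /tmp/hadoop-hadoop-namenode.pid, and storing this shell's own PID in it is an assumption purely for illustration:

```shell
# Create a stand-in PID file, as start-dfs.sh does for the namenode.
PIDFILE=$(mktemp)
echo $$ > "$PIDFILE"

# Read the recorded PID and probe it, as systemd does for PIDFile=.
pid=$(cat "$PIDFILE")
if kill -0 "$pid" 2>/dev/null; then
    echo "main process $pid is alive"   # systemd keeps the unit active
else
    echo "main process $pid is gone"    # systemd marks the unit inactive/failed
fi
rm -f "$PIDFILE"
```

If the PID file is never written, or names a process that has already exited, systemd considers the service dead even though the daemons may still be running under other PIDs.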

