Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode – tried all solutions but the error still persists

Problem description

I am following a tutorial to install Hadoop. It is explained with Hadoop 1.x, but I am using hadoop-2.6.0.

I have successfully completed all the steps up to the point of executing the following command.

bin/hadoop namenode -format
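
Note: on Hadoop 2.x this operation is also exposed through the hdfs launcher; the hadoop namenode form above still works but prints a deprecation warning:

bin/hdfs namenode -format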

I am getting the following error when I execute the above command.

Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode

My hadoop-env.sh file:

# The java implementation to use.
export JAVA_HOME="C:/Program Files/Java/jdk1.8.0_74"

# The jsvc implementation to use. Jsvc is required to run secure datanodes
# that bind to privileged ports to provide authentication of data transfer
# protocol.  Jsvc is not required if SASL is configured for authentication of
# data transfer protocol using non-privileged ports.
#export JSVC_HOME=${JSVC_HOME}

export HADOOP_PREFIX="/home/582092/hadoop-2.6.0"

export HADOOP_HOME="/home/582092/hadoop-2.6.0"
export HADOOP_COMMON_HOME=$HADOOP_HOME
#export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_HDFS_HOME=$HADOOP_HOME
export PATH=$PATH:$HADOOP_PREFIX/bin

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}

core-site.xml

[core-site.xml was posted as an image in the original question]

hdfs-site.xml

<property>
  <name>dfs.data.dir</name>
  <value>/home/582092/hadoop-dir/datadir</value>
</property>
<property>
  <name>dfs.name.dir</name>
  <value>/home/582092/hadoop-dir/namedir</value>
</property>


Kindly help me in fixing this issue.

Recommended answer

One cause of this problem can be a user-defined HDFS_DIR environment variable, which is picked up by the launcher scripts, for example in these lines of libexec/hadoop-functions.sh:

HDFS_DIR=${HDFS_DIR:-"share/hadoop/hdfs"}
...
if [[ -z "${HADOOP_HDFS_HOME}" ]] &&
   [[ -d "${HADOOP_HOME}/${HDFS_DIR}" ]]; then
  export HADOOP_HDFS_HOME="${HADOOP_HOME}"
fi

The solution is to avoid defining the HDFS_DIR environment variable.
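
A quick way to confirm is to check the current shell for the variable and clear it before re-running the format command. A minimal sketch, assuming a Bash shell:

# Show whether HDFS_DIR is set in the environment
env | grep '^HDFS_DIR=' || echo "HDFS_DIR is not set"

# Remove it for this session, then retry the format
unset HDFS_DIR
bin/hdfs namenode -format

Also check shell startup files such as ~/.bashrc so the variable is not exported again on the next login.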

The recommendation in the comments on the question is correct – use the hadoop classpath command to check whether the hadoop-hdfs-*.jar files are present on the classpath. In my case they were missing.
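
For example (the grep pattern here is only an illustration):

# Print each classpath entry on its own line and look for the HDFS jars
bin/hadoop classpath | tr ':' '\n' | grep hdfs

If nothing matches, the hadoop-hdfs-*.jar files are not on the classpath and the NameNode class cannot be loaded.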
