Could not find or load main class org.apache.hadoop.fs.FsShell


Problem description

I understand this question might have been answered already, but my issue is still here:

I have a VM created for Hadoop on VMware using CentOS 7. I can start the namenode and datanode; however, when I try to view HDFS files using the following command:

hdfs dfs -ls

it throws the error below:

Could not find or load main class org.apache.hadoop.fs.FsShell

My Google searches suggest this might relate to the Hadoop variable settings in bash. Here are my settings:

# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
export HADOOP_HOME=/opt/hadoop/hadoop-2.7.2
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_PREFIX=$HADOOP_HOME

export HIVE_HOME=/opt/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH

export ANT_HOME=/usr/local/apache-ant-1.9.7
export PATH=${PATH}:${JAVA_HOME}/bin

export PIG_HOME=/opt/hadoop/pig-0.15.0
export PIG_HADOOP_VERSION=0.15.0
export PIG_CLASSPATH=$HADOOP_HOME/etc/hadoop

export PATH=$PATH:$PIG_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_USER_CLASSPATH_FIRST=true

export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin

export HADOOP_CLASSPATH=$HADOOP_HOME/share/hadoop/common/
export PATH=$PATH:$HADOOP_CLASSPATH

# Uncomment the following line if you don't like systemctl's auto-paging feature:
# export SYSTEMD_PAGER=
# User specific aliases and functions

I checked my hadoop folder, /opt/hadoop/hadoop-2.7.2/share/hadoop/common; here is the list:

[directory listing screenshot not reproduced]

I am doing this practice using the root account. Can anyone help find the cause of this issue and fix it? Thank you very much.

Solution

This typically happens when you have multiple instances of Hadoop. Run which hadoop and see whether it is pointing to the version that you have installed.
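A quick way to check (a sketch; the install path below is taken from the question):

which hadoop
# if this prints /usr/bin/hadoop instead of /opt/hadoop/hadoop-2.7.2/bin/hadoop,
# the shell is resolving a different Hadoop than the one configured in .bashrc
hadoop classpath
# the output should include /opt/hadoop/hadoop-2.7.2/share/hadoop/common/*,
# the directory holding hadoop-common, which is the jar that contains
# org.apache.hadoop.fs.FsShell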

Say it points to /usr/bin/hadoop and not /your-path/hadoop; then you can point /usr/bin/hadoop to your installation with a symlink, as sketched below.
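A minimal sketch of that fix, assuming the stray copy lives at /usr/bin/hadoop and the real installation is the one from the question:

# back up the stray binary, then symlink the installed version in its place
mv /usr/bin/hadoop /usr/bin/hadoop.bak
ln -s /opt/hadoop/hadoop-2.7.2/bin/hadoop /usr/bin/hadoop
# verify: this should now report 2.7.2
hadoop version

Alternatively, putting $HADOOP_HOME/bin ahead of /usr/bin in PATH achieves the same result without touching /usr/bin. Note that the .bashrc above appends $HADOOP_HOME/bin to the end of PATH, so any existing /usr/bin/hadoop would win the lookup.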
