Hadoop MapReduce NoSuchElementException


Problem description

I wanted to run a MapReduce job on my two-node FreeBSD cluster, but I get the following exception:

14/08/27 14:23:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/08/27 14:23:04 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/08/27 14:23:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/08/27 14:23:04 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/08/27 14:23:04 WARN mapreduce.JobSubmitter: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
14/08/27 14:23:04 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-otlam/mapred/staging/otlam968414084/.staging/job_local968414084_0001
Exception in thread "main" java.util.NoSuchElementException
at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:565)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.checkPermissionOfOther(ClientDistributedCacheManager.java:276)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.isPublic(ClientDistributedCacheManager.java:240)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineCacheVisibilities(ClientDistributedCacheManager.java:162)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:58)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
...

This happens when I try to run job.waitForCompletion(true); on a new MapReduce job. The NoSuchElementException is thrown because a StringTokenizer had no more elements when next() was called on it. I took a look into the source and found the following section in RawLocalFileSystem.java:

/// loads permissions, owner, and group from `ls -ld`
private void loadPermissionInfo() {
  IOException e = null;
  try {
    String output = FileUtil.execCommand(new File(getPath().toUri()), 
        Shell.getGetPermissionCommand());
    StringTokenizer t =
        new StringTokenizer(output, Shell.TOKEN_SEPARATOR_REGEX);
    //expected format
    //-rw-------    1 username groupname ...
    String permission = t.nextToken();

As far as I can see, Hadoop tries to determine the permissions of a specific file with ls -ld, which works perfectly when I use it in the console. Unfortunately I haven't found out yet which file's permissions it was looking for.
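To illustrate the failure mode, here is a minimal sketch (my own example, not Hadoop's actual code): if the output of ls -ld comes back empty, or is not split into the expected whitespace-separated fields, the very first nextToken() call throws exactly this NoSuchElementException. The sample output string and the delimiter set are assumptions for the example.

import java.util.StringTokenizer;

public class LsOutputSketch {
    public static void main(String[] args) {
        // Hypothetical output of `ls -ld <file>`: empty here, standing in for
        // output that could not be read or does not have the expected fields.
        String output = "";
        // Plain whitespace delimiters, roughly analogous to Shell.TOKEN_SEPARATOR_REGEX.
        StringTokenizer t = new StringTokenizer(output, " \t\n\r\f");
        // With no tokens available, this throws java.util.NoSuchElementException,
        // matching the first frame of the stack trace above.
        String permission = t.nextToken();
        System.out.println(permission);
    }
}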

The Hadoop version is 2.4.1, the HBase version is 0.98.4, and I am using the Java API. Other operations, like creating a table, work fine. Did anyone experience similar problems or know what to do?
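For context, table creation via the HBase 0.98 client API along the following lines works without problems; the table and column family names here are just placeholders for the example.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateTableExample {
    public static void main(String[] args) throws Exception {
        // Standard HBase client configuration (reads hbase-site.xml from the classpath).
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        // "mytable" and "cf" are placeholder names.
        HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("mytable"));
        desc.addFamily(new HColumnDescriptor("cf"));
        admin.createTable(desc);
        admin.close();
    }
}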

EDIT: I just found out that this is purely a Hadoop-related issue. Even the simplest MapReduce operation, without using HDFS at all, gives me the same exception.
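To make that concrete, a bare-bones, map-only job like the following sketch (the class name and the local input/output paths are placeholders) is the kind of minimal job I mean; it fails during submission with the same stack trace.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MinimalJob {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "minimal-job");
        job.setJarByClass(MinimalJob.class);
        // Identity mapper, no reducers: the smallest job that can be submitted.
        job.setMapperClass(Mapper.class);
        job.setNumReduceTasks(0);
        // Default TextInputFormat emits LongWritable/Text key-value pairs.
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path("input"));    // placeholder path
        FileOutputFormat.setOutputPath(job, new Path("output")); // placeholder path
        // Submission is where the NoSuchElementException above is thrown.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}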

Solution

Can you please check if this solves your problem?

If yours is a permission issue, then this works.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

// NOTE: the class name is a placeholder; the original answer showed only the main method.
public class HdfsAsUser {
    public static void main(String[] args) throws Exception {
        // set user group information: act as the remote "hdfs" user
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");
        // run the filesystem access with that user's privileges
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                // create configuration object
                Configuration config = new Configuration();
                config.set("fs.defaultFS", "hdfs://ip:port/");
                config.set("hadoop.job.ugi", "hdfs");
                FileSystem dfs = FileSystem.get(config);
                // ... filesystem operations elided in the original answer ...
                return null;
            }
        });
    }
}
