Exception while submitting a mapreduce job from remote system


Problem description



I got an exception while submitting a MapReduce job from a remote system:

13/10/28 18:49:52 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/F:/Workspaces/Test/Hadoop/test

My Hadoop and MapReduce environment is configured on a Linux machine. I submit the wordcount job from a local Windows PC as follows:

// Imports needed by this driver (old "mapred" API). The enclosing MapReduce
// driver class and the Map / Reduce implementations are not shown here.
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.security.UserGroupInformation;

public static void main(String[] args) throws Exception {

    // Submit the job as the remote "root" user.
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("root");

    try {
        ugi.doAs(new PrivilegedExceptionAction<Void>() {

            public Void run() throws Exception {

                JobConf conf = new JobConf(MapReduce.class);
                conf.set("mapred.job.name", "MyApp");
                // Remote JobTracker and HDFS NameNode on the Linux machine.
                conf.set("mapred.job.tracker", "192.168.1.149:9001");
                conf.set("fs.default.name", "hdfs://192.168.1.149:9000");
                conf.set("hadoop.job.ugi", "root");

                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);

                conf.setMapperClass(Map.class);
                conf.setCombinerClass(Reduce.class);
                conf.setReducerClass(Reduce.class);

                conf.setInputFormat(TextInputFormat.class);
                conf.setOutputFormat(TextOutputFormat.class);

                FileInputFormat.setInputPaths(conf, new Path("test"));
                FileOutputFormat.setOutputPath(conf, new Path("test"));

                JobClient.runJob(conf);

                return null;
            }
        });
    } catch (Exception e) {
        e.printStackTrace();
    }
}

where 192.168.1.149 is the Linux PC on which Hadoop is configured. I started the Hadoop and MapReduce services there. The test directory was also created with the same Java API, and that worked, but the MapReduce job did not.
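For reference, here is a minimal sketch of how such a directory can be created from a remote client with the Hadoop FileSystem API; the NameNode address, user name, class name, and the relative test path simply mirror the configuration shown above and are assumptions, not code from the original question.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class CreateTestDir {
    public static void main(String[] args) throws Exception {
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("root");
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                Configuration conf = new Configuration();
                // Same NameNode address as in the job submission above (assumed).
                conf.set("fs.default.name", "hdfs://192.168.1.149:9000");

                FileSystem fs = FileSystem.get(conf);
                // A relative path resolves against the user's home directory on HDFS,
                // not against the local Windows filesystem.
                Path dir = new Path("test");
                if (!fs.exists(dir)) {
                    fs.mkdirs(dir);
                }
                System.out.println("test exists on HDFS: " + fs.exists(dir));
                return null;
            }
        });
    }
}

Checking fs.exists(new Path("test")) with the same configuration before calling JobClient.runJob can also help tell whether a relative input path is resolving against HDFS or, as in the error above, against the local file: filesystem.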

**Please help.**

Solution

Actually, it was my configuration mistake:

I had missed the mapred.local.dir property in mapred-site.xml:

 

<property>
    <name>mapred.local.dir</name>
    <value>/usr/local/hadoop-1.0.3/local</value>
</property>
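For context, this property goes inside the <configuration> element of mapred-site.xml on the cluster machine; the snippet below is only a minimal sketch of that placement, reusing the value from the answer above.

<configuration>
    <!-- Local directory where MapReduce stores intermediate job data;
         it should be an existing local directory writable by the Hadoop daemons. -->
    <property>
        <name>mapred.local.dir</name>
        <value>/usr/local/hadoop-1.0.3/local</value>
    </property>
</configuration>

As with any change to mapred-site.xml, the MapReduce services (JobTracker/TaskTracker) typically need to be restarted for the new value to take effect.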
