Can't use CompositeInputFormat with Hadoop, throwing exception Expression is null
Problem description
I'm using the MRv1 from CDH4 (4.5) and facing a problem with CompositeInputFormat. It doesn't matter how many inputs I try to join. For the sake of simplicity, here's the example with just one input:
Configuration conf = new Configuration();
Job job = new Job(conf, "Blah");
job.setJarByClass(Blah.class);
job.setMapperClass(Blah.BlahMapper.class);
job.setReducerClass(Blah.BlahReducer.class);
job.setMapOutputKeyClass(LongWritable.class);
job.setMapOutputValueClass(BlahElement.class);
job.setOutputKeyClass(LongWritable.class);
job.setOutputValueClass(BlahElement.class);
job.setInputFormatClass(CompositeInputFormat.class);
String joinStatement = CompositeInputFormat.compose("inner", SequenceFileInputFormat.class, "/someinput");
System.out.println(joinStatement);
conf.set("mapred.join.expr", joinStatement);
job.setOutputFormatClass(SequenceFileOutputFormat.class);
FileOutputFormat.setOutputPath(job, new Path(newoutput));
return job.waitForCompletion(true) ? 0 : 1;
Here's the output + stacktrace:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop2/share/hadoop/mapreduce1/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop2/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
14/01/31 03:27:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
inner(tbl(org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat,"/someinput"))
14/01/31 03:27:48 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/01/31 03:27:51 INFO mapred.JobClient: Cleaning up the staging area hdfs://archangel-desktop:54310/tmp/hadoop/mapred/staging/hadoop/.staging/job_201401302213_0013
14/01/31 03:27:51 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:java.io.IOException: Expression is null
Exception in thread "main" java.io.IOException: Expression is null
at org.apache.hadoop.mapreduce.lib.join.Parser.parse(Parser.java:542)
at org.apache.hadoop.mapreduce.lib.join.CompositeInputFormat.setFormat(CompositeInputFormat.java:85)
at org.apache.hadoop.mapreduce.lib.join.CompositeInputFormat.getSplits(CompositeInputFormat.java:127)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1079)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1096)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:177)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:995)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:948)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:948)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
at com.nileshc.graphfu.pagerank.BlockMatVec.run(BlockMatVec.java:79)
at com.nileshc.graphfu.Main.main(Main.java:21)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Anyone ever faced this before? Any ideas on how to solve it?
Answer
My bad.
conf.set("mapred.join.expr", joinStatement);
The above should be:
job.getConfiguration().set("mapreduce.join.expr", joinStatement);
And:
String joinStatement = CompositeInputFormat.compose("inner", SequenceFileInputFormat.class, "/someinput");
^^ should be:
String joinStatement = CompositeInputFormat.compose("inner", SequenceFileInputFormat.class, new Path("/someinput"));
The first change is what made all the difference.
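Putting both corrections together, the relevant part of the driver might look like the sketch below. It reuses the classes and the "/someinput" path from the question; the surrounding job setup (mapper, reducer, output classes) is unchanged and elided.

```java
// Sketch of the corrected driver fragment, assuming the same imports as
// the question (org.apache.hadoop.mapreduce.lib.join.CompositeInputFormat,
// org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat, etc.).
Configuration conf = new Configuration();
Job job = new Job(conf, "Blah");
job.setInputFormatClass(CompositeInputFormat.class);

// Fix 1: compose() is given a Path rather than a raw String.
String joinStatement = CompositeInputFormat.compose(
        "inner", SequenceFileInputFormat.class, new Path("/someinput"));

// Fix 2: set the expression on the Job's own Configuration, under the
// new-API key "mapreduce.join.expr". Setting "mapred.join.expr" on the
// local `conf` after the Job was constructed never reaches the submitted
// job, which is why CompositeInputFormat saw a null expression.
job.getConfiguration().set("mapreduce.join.expr", joinStatement);
```

As the answer notes, the second fix (writing to `job.getConfiguration()` with the correct key) is the one that resolves the "Expression is null" error.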