Run Hadoop job without using JobConf
Question
I can't find a single example of submitting a Hadoop job that does not use the deprecated JobConf class. JobClient, which hasn't been deprecated, still only supports methods that take a JobConf parameter.
Can someone please point me at an example of Java code submitting a Hadoop map/reduce job using only the Configuration class (not JobConf), and using the mapreduce.lib.input package instead of mapred.input?
Answer
Hope this helps!
import java.io.File;

import org.apache.commons.io.FileUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MapReduceExample extends Configured implements Tool {

    // Identity mapper built on the new org.apache.hadoop.mapreduce API,
    // not the deprecated org.apache.hadoop.mapred one.
    static class MyMapper extends Mapper<LongWritable, Text, LongWritable, Text> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws java.io.IOException, InterruptedException {
            context.getCounter("mygroup", "jeff").increment(1);
            context.write(key, value);
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        // Job.getInstance replaces the (now deprecated) Job constructor and
        // takes the Configuration supplied by ToolRunner -- no JobConf involved.
        Job job = Job.getInstance(getConf());
        job.setJarByClass(MapReduceExample.class);
        job.setMapperClass(MyMapper.class);
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // Clear the output directory first; the paths are hard-coded for this demo.
        FileUtils.deleteDirectory(new File("data/output"));
        args = new String[] { "data/input", "data/output" };
        ToolRunner.run(new Configuration(), new MapReduceExample(), args);
    }
}