Hadoop: LongWritable cannot be cast to org.apache.hadoop.io.IntWritable

Problem description

I want to compute the mean of the temperatures given in an input file. My Mapper and Reducer syntax seems fine to me, but I am still getting the following error:

    Unable to load realm info from SCDynamicStore
    13/02/17 08:03:28 INFO mapred.JobClient: Task Id : attempt_201302170552_0009_m_000000_1, Status : FAILED
    java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.IntWritable
        at org.apache.hadoop.examples.TempMeasurement$TempMapper.map(TempMeasurement.java:26)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

My Mapper function is this:

public static class TempMapper extends Mapper<IntWritable, Text, IntWritable, FloatWritable> {

    @Override
    protected void map(IntWritable key, Text value, Context context)
            throws IOException, InterruptedException {

        //code for getting date and temperature

        String temp = columns.get(3);
        context.write(new IntWritable(year), new FloatWritable(Float.valueOf(temp)));
    }
}

And the Reducer is:

  public static class IntSumReducer
       extends Reducer<IntWritable, FloatWritable, IntWritable, FloatWritable> {
    private FloatWritable result = new FloatWritable();

    public void reduce(IntWritable key, Iterable<FloatWritable> values,
                       Context context
                       ) throws IOException, InterruptedException {

      //code for making calculations    

      context.write(key, result);
    }
  }

The input file looks like this:

11111, 0, 19900101, 44.04,
11112, 0, 19900102, 50.00,
11113, 3, 19910203, 30.00,

Any help would be appreciated.

Solution

The input key class of a mapper that reads text files (Hadoop's default TextInputFormat) is always LongWritable. That is because it carries the byte offset of the current line within the file, which could easily overflow an int.

Basically you need to change your code to this:

public static class TempMapper extends Mapper<LongWritable, Text, IntWritable, FloatWritable>{

  @Override
  protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
       //code for getting date and temperature
       String temp = columns.get(3);
       context.write(new IntWritable(year), new FloatWritable(Float.valueOf(temp)));
  }
}
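
For completeness, here is a minimal sketch of how the whole job could fit together, with the parts the question elides filled in. The CSV parsing in map, the mean calculation in reduce, and the driver are all assumptions: the column positions follow the sample input (date in the third field, temperature in the fourth), and MeanReducer is a hypothetical name replacing IntSumReducer, which no longer describes what the reducer does.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TempMeasurement {

    public static class TempMapper
            extends Mapper<LongWritable, Text, IntWritable, FloatWritable> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assumed line layout: "id, flag, yyyymmdd, temperature,"
            String[] columns = value.toString().split(",");
            int year = Integer.parseInt(columns[2].trim().substring(0, 4));
            float temp = Float.parseFloat(columns[3].trim());
            context.write(new IntWritable(year), new FloatWritable(temp));
        }
    }

    public static class MeanReducer
            extends Reducer<IntWritable, FloatWritable, IntWritable, FloatWritable> {
        private final FloatWritable result = new FloatWritable();

        @Override
        protected void reduce(IntWritable key, Iterable<FloatWritable> values,
                Context context) throws IOException, InterruptedException {
            // Sum all temperatures for this year, then divide by their count.
            float sum = 0f;
            int count = 0;
            for (FloatWritable v : values) {
                sum += v.get();
                count++;
            }
            result.set(sum / count);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hadoop 1.x constructor, matching the mapred.Child frames in the stack trace
        Job job = new Job(conf, "temperature mean");
        job.setJarByClass(TempMeasurement.class);
        job.setMapperClass(TempMapper.class);
        job.setReducerClass(MeanReducer.class);
        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(FloatWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Note that no combiner is set: averaging is not associative, so reusing the reducer as a combiner here would skew the result.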
