How to emit 2D double array from mapper using TwoDArrayWritable
Question
I want to emit a 2D double array as the value, using TwoDArrayWritable. How do I write the context.write(key, ...) call?

EDIT: And in the Reducer, how do I get the values back as a two-dimensional double array and print them?
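The retrieval side asked about in the EDIT can be sketched as follows. This is not paste-ready Hadoop code: Writable, DoubleWritable, and TwoDArrayWritable below are minimal stand-ins written only to show the get-and-cast round trip (the real TwoDArrayWritable.get() likewise returns Writable[][], whose elements are cast back to DoubleWritable before reading the double out).

```java
// Minimal stand-ins (NOT the real Hadoop classes) illustrating the round
// trip: wrap doubles and set() on the mapper side, then get(), cast each
// element back to DoubleWritable, and unwrap plain doubles on the reducer side.
public class TwoDRoundTrip {
    interface Writable {}                     // stand-in for org.apache.hadoop.io.Writable

    static class DoubleWritable implements Writable {
        private final double value;
        DoubleWritable(double value) { this.value = value; }
        double get() { return value; }        // the real DoubleWritable.get() works the same way
    }

    static class TwoDArrayWritable {
        private Writable[][] values;
        void set(Writable[][] values) { this.values = values; }
        Writable[][] get() { return values; } // the real TwoDArrayWritable.get() also returns Writable[][]
    }

    // What the reducer would do with the received value.
    static double[][] toDoubles(TwoDArrayWritable array) {
        Writable[][] raw = array.get();
        double[][] out = new double[raw.length][];
        for (int i = 0; i < raw.length; i++) {
            out[i] = new double[raw[i].length];
            for (int j = 0; j < raw[i].length; j++) {
                out[i][j] = ((DoubleWritable) raw[i][j]).get(); // cast, then unwrap
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] E = { {1.0, 2.0}, {3.0, 4.0} };

        // Mapper side: wrap every element, then set.
        DoubleWritable[][] inner = new DoubleWritable[E.length][E[0].length];
        for (int i = 0; i < E.length; i++)
            for (int j = 0; j < E[0].length; j++)
                inner[i][j] = new DoubleWritable(E[i][j]);
        TwoDArrayWritable array = new TwoDArrayWritable();
        array.set(inner);

        // Reducer side: get, cast, print.
        for (double[] row : toDoubles(array))
            System.out.println(java.util.Arrays.toString(row));
    }
}
```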
What I wrote in the Mapper:
row = E.length;
col = E[0].length;
TwoDArrayWritable array = new TwoDArrayWritable(DoubleWritable.class);
DoubleWritable[][] myInnerArray = new DoubleWritable[row][col];
// set values in myInnerArray
for (int k1 = 0; k1 < row; k1++) {
    for (int j1 = 0; j1 < col; j1++) {
        myInnerArray[k1][j1] = new DoubleWritable(E[k1][j1]);
    }
}
array.set(myInnerArray);
context.write(clusterNumber, array);
But it throws a NullPointerException:
13/11/01 16:34:07 INFO mapred.LocalJobRunner: Map task executor complete.
13/11/01 16:34:07 WARN mapred.LocalJobRunner: job_local724758890_0001
java.lang.Exception: java.lang.NullPointerException
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.lang.NullPointerException
at org.apache.hadoop.io.TwoDArrayWritable.write(TwoDArrayWritable.java:91)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:100)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:84)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:945)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:601)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
at edu.Mapper.map(Mapper.java:277)
Mapper.java:277 : context.write(clusterNumber, array);
Solution
TwoDArrayWritable array = new TwoDArrayWritable (DoubleWritable.class);
DoubleWritable[][] myInnerArray = new DoubleWritable[10][];
// set values in myInnerArray
array.set(myInnerArray);
context.write(key, array);
I believe the NPE is because some element of the array is null. From the code you posted, I wonder: does ecol equal col?
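To see why a half-filled array produces exactly this stack trace: TwoDArrayWritable.write loops over every element of the 2D array and calls that element's write method, so any slot that was never assigned is a null dereference. The plain-Java sketch below mimics that serialization loop (Cell is a hypothetical stand-in for DoubleWritable, not Hadoop code):

```java
// Demonstrates why serializing a 2D array with unassigned (null) slots
// fails with a NullPointerException, mirroring TwoDArrayWritable.write,
// which calls write() on every element of the grid.
public class NullSlotDemo {
    // Stand-in for DoubleWritable: wraps a double and "serializes" it.
    static final class Cell {
        final double v;
        Cell(double v) { this.v = v; }
        double write() { return v; } // analogous to Writable.write(DataOutput)
    }

    static double serialize(Cell[][] grid) {
        double sum = 0;
        for (Cell[] row : grid)
            for (Cell c : row)
                sum += c.write(); // NullPointerException here if c was never assigned
        return sum;
    }

    public static void main(String[] args) {
        // Fully populated grid: serializes fine.
        Cell[][] full = new Cell[2][2];
        for (int i = 0; i < 2; i++)
            for (int j = 0; j < 2; j++)
                full[i][j] = new Cell(i + j);
        System.out.println(serialize(full)); // prints 4.0

        // Partially populated grid: the other three slots stay null.
        Cell[][] partial = new Cell[2][2];
        partial[0][0] = new Cell(1.0);
        try {
            serialize(partial);
        } catch (NullPointerException e) {
            System.out.println("NPE, just like TwoDArrayWritable.write");
        }
    }
}
```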