Array types come up frequently in MapReduce development. For example, when you need a group of DoubleWritable values:
    DoubleWritable[] doublewritable = new DoubleWritable[n];
However, a DoubleWritable[] in this form cannot be passed as a value between map and reduce, because a plain Java array is not itself a Writable. Hadoop's serializable array type is ArrayWritable.
To pass an array of DoubleWritable as a value, you have to write a subclass of ArrayWritable, because ArrayWritable itself has no no-argument constructor and the framework cannot instantiate it during deserialization:
    import org.apache.hadoop.io.ArrayWritable;
    import org.apache.hadoop.io.DoubleWritable;

    public class NewArrayWritable extends ArrayWritable {
        public NewArrayWritable() {
            super(DoubleWritable.class);
        }
    }
Then wrap the array before emitting it:

    NewArrayWritable ntemp = new NewArrayWritable();
    ntemp.set(ctemp);  // ctemp is the DoubleWritable[] filled in earlier
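For context, here is a minimal mapper sketch showing the whole flow. The class name ArrayMapper, the comma-separated input format, and the way ctemp is filled are illustrative assumptions, not from the original post:

    import java.io.IOException;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical mapper: parses a line of comma-separated numbers and
    // emits the whole row as one NewArrayWritable value.
    public class ArrayMapper extends Mapper<LongWritable, Text, Text, NewArrayWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            DoubleWritable[] ctemp = new DoubleWritable[fields.length];
            for (int i = 0; i < fields.length; i++) {
                ctemp[i] = new DoubleWritable(Double.parseDouble(fields[i]));
            }
            NewArrayWritable ntemp = new NewArrayWritable();
            ntemp.set(ctemp);  // set() accepts any Writable[]
            context.write(new Text("row"), ntemp);
        }
    }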
In the driver, the map output value class must be changed from IntWritable to NewArrayWritable:

    // job.setMapOutputValueClass(IntWritable.class);  // old setting
    job.setMapOutputValueClass(NewArrayWritable.class);
Note: at this point the job may fail with the exception

    wrong value class: class org.apache.hadoop.io.IntWritable is not class org.apache.hadoop.examples.NewArrayWritable
Solution: comment out job.setCombinerClass(IntSumReducer.class); in the main function. The combiner runs IntSumReducer, which emits IntWritable values, and those no longer match the declared map output value class NewArrayWritable, hence the exception.
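On the reduce side, the array comes back as a Writable[] whose elements are DoubleWritable, so each element must be cast individually (casting the whole array to DoubleWritable[] throws a ClassCastException after deserialization, because readFields rebuilds it as a Writable[]). A minimal sketch, with the summing logic as an illustrative assumption:

    import java.io.IOException;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.mapreduce.Reducer;

    // Hypothetical reducer: sums every element of every array received for a key.
    public class ArrayReducer extends Reducer<Text, NewArrayWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text key, Iterable<NewArrayWritable> values, Context context)
                throws IOException, InterruptedException {
            double sum = 0.0;
            for (NewArrayWritable arr : values) {
                for (Writable w : arr.get()) {       // get() returns Writable[]
                    sum += ((DoubleWritable) w).get();
                }
            }
            context.write(key, new DoubleWritable(sum));
        }
    }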
Source: http://blog.csdn.net/ltianchao/article/details/18555159