Type mismatch: cannot convert from Iterator<String> to Iterable<String> in Java Spark
Problem description
线程"main"中的异常java.lang.Error:未解决的编译问题: 类型不匹配:无法从Iterator转换为Iterable
Exception in thread "main" java.lang.Error: Unresolved compilation problem: Type mismatch: cannot convert from Iterator to Iterable
at com.spark.wordcount.lession1.WordCount2.main(WordCount2.java:26)
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

// SPACE is the split pattern used below (not shown in the original snippet)
private static final Pattern SPACE = Pattern.compile(" ");

SparkConf conf = new SparkConf().setAppName("cust data").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<String> lines = sc.textFile("C:\\Users\\dell\\Desktop\\simple_text_file.txt");
JavaRDD<String> words = lines.flatMap(s -> Arrays.asList(SPACE.split(s)).iterator());
JavaPairRDD<String, Integer> ones = words.mapToPair(s -> new Tuple2<>(s, 1));
JavaPairRDD<String, Integer> counts = ones.reduceByKey((i1, i2) -> i1 + i2);
List<Tuple2<String, Integer>> output = counts.collect();
for (Tuple2<?,?> tuple : output) {
    System.out.println(tuple._1() + ": " + tuple._2());
}
Recommended answer
You are mixing incompatible versions of Spark / code:
- In Spark 1.x, FlatMapFunction.call is Iterable<R> call(T t).
- In Spark 2.x, FlatMapFunction.call is Iterator<R> call(T t).
You should either upgrade your Spark dependency to 2.x and keep your current code, or use a FlatMapFunction compatible with the 1.x branch:
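If you go the upgrade route with a Maven build, the dependency change is a sketch like the following; the version number and Scala suffix here are assumptions, so match them to your actual cluster and Scala version:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.8</version>
</dependency>
```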
JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    @Override
    public Iterable<String> call(String s) {
        return Arrays.asList(SPACE.split(s));
    }
});
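The root of the compile error is simply the plain-Java distinction between Iterable and Iterator. A minimal sketch (no Spark required; the class name and sample line are made up) showing the two shapes the two API branches expect:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.regex.Pattern;

public class SplitShapesDemo {
    private static final Pattern SPACE = Pattern.compile(" ");

    public static void main(String[] args) {
        String line = "hello spark hello";

        // Spark 1.x's FlatMapFunction.call must return an Iterable<String>:
        List<String> asIterable = Arrays.asList(SPACE.split(line));

        // Spark 2.x's FlatMapFunction.call must return an Iterator<String>;
        // calling .iterator() on the list produces exactly that.
        Iterator<String> asIterator = asIterable.iterator();

        System.out.println(asIterable.size()); // number of words in the line
        while (asIterator.hasNext()) {
            System.out.println(asIterator.next());
        }
    }
}
```

An Iterator is not an Iterable (and vice versa), so a 2.x-style lambda returning `...iterator()` cannot satisfy a 1.x interface that declares Iterable, which is exactly the mismatch the compiler reports.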