convert RDD to Dataset in Java Spark
Question
I have an RDD and I need to convert it into a Dataset. I tried:
Dataset<Person> personDS = sqlContext.createDataset(personRDD, Encoders.bean(Person.class));
The above line throws an error:

cannot resolve method createDataset(org.apache.spark.api.java.JavaRDD<Main.Person>, org.apache.spark.sql.Encoder<T>)
However, I can convert to a Dataset after first converting to a DataFrame. The code below works:
Dataset<Row> personDF = sqlContext.createDataFrame(personRDD, Person.class);
Dataset<Person> personDS = personDF.as(Encoders.bean(Person.class));
Answer
.createDataset() accepts an RDD<T>, not a JavaRDD<T>. JavaRDD is a wrapper around RDD that makes calls from Java code easier; it holds the underlying RDD internally, which can be accessed with .rdd(). The following creates a Dataset:
Dataset<Person> personDS = sqlContext.createDataset(personRDD.rdd(), Encoders.bean(Person.class));
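One detail worth noting: Encoders.bean(Person.class) inspects the class through JavaBean conventions, so it needs a public class with a public no-argument constructor and getter/setter pairs for every field. The original question does not show the Person class, so the sketch below assumes hypothetical name and age fields:

```java
import java.io.Serializable;

// Hypothetical Person bean compatible with Encoders.bean:
// public class, no-arg constructor, getters/setters for each field.
public class Person implements Serializable {
    private String name;
    private int age;

    public Person() {}  // no-arg constructor required by the bean encoder

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```

If a field lacks a getter/setter pair, the bean encoder simply will not pick it up as a column, which is a common source of silently missing data in the resulting Dataset.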