Convert Spark DataFrame to Pojo Object
Question
Please see the code below:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

// Create Spark context
SparkConf sparkConf = new SparkConf().setAppName("TestWithObjects").setMaster("local");
JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
// Create RDD
JavaRDD<Person> personsRDD = javaSparkContext.parallelize(persons);
// Create SQL context
SQLContext sQLContext = new SQLContext(javaSparkContext);
DataFrame personDataFrame = sQLContext.createDataFrame(personsRDD, Person.class);
personDataFrame.show();
personDataFrame.printSchema();
personDataFrame.select("name").show();
personDataFrame.registerTempTable("peoples");
DataFrame result = sQLContext.sql("SELECT * FROM peoples WHERE name='test'");
result.show();
After this, I need to convert the DataFrame 'result' into a Person object or a List of Person. Thanks in advance.
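For createDataFrame(personsRDD, Person.class) (and the bean encoder used in the answer below) to work, Person must be a standard Java bean: a public class with a no-argument constructor and a getter/setter pair for each field. The original post does not show the class, so the fields here (name, age) are assumptions; a minimal sketch might look like:

```java
import java.io.Serializable;

// Hypothetical Person bean; the name and age fields are assumptions.
// Serializable lets Spark ship instances to executors.
public class Person implements Serializable {
    private String name;
    private int age;

    // No-arg constructor required for bean encoding
    public Person() {}

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```

Spark derives the DataFrame schema from the getter names, so the column names become name and age.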
Answer
DataFrame is simply a type alias of Dataset[Row]. Operations on a DataFrame are referred to as "untyped transformations", in contrast to the "typed transformations" that come with strongly typed Scala/Java Datasets.
The conversion from Dataset[Row] to Dataset[Person] is very straightforward in Spark:
DataFrame result = sQLContext.sql("SELECT * FROM peoples WHERE name='test'");
At this point, Spark holds your data as a DataFrame = Dataset[Row], a collection of generic Row objects, since it does not know the exact type.
// Create an Encoder for the Person Java bean
Encoder<Person> personEncoder = Encoders.bean(Person.class);
Dataset<Person> personDF = result.as(personEncoder);
personDF.show();
Now Spark converts the Dataset[Row] into a Dataset[Person] of type-specific Scala/Java JVM objects, as dictated by the class Person.
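From a typed Dataset[Person] it is one more call to get what the question asked for: a Person object or a List on the driver. A sketch, assuming a Spark version where Dataset exposes collectAsList() (it does in 1.6+); only collect like this when the result fits in driver memory:

```java
// Collect the typed rows back to the driver as a List<Person>
List<Person> people = personDF.collectAsList();

// Each element is a plain Person POJO
Person first = people.get(0);
String name = first.getName();
```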
Please refer to the Databricks documentation on Datasets for further details.