How to convert Row of a Scala DataFrame into case class most efficiently?

Question

Once I have got some Row class in Spark, either from a DataFrame or from Catalyst, I want to convert it to a case class in my code. This can be done by matching:

someRow match { case Row(a: Long, b: String, c: Double) => myCaseClass(a, b, c) }

But it becomes ugly when the row has a huge number of columns, say a dozen Doubles, some Booleans, and even the occasional null.

I would just like to be able to (sorry) cast the Row to myCaseClass. Is that possible, or have I already got the most economical syntax?

Answer

DataFrame is simply a type alias of Dataset[Row]. Operations on it are therefore also referred to as "untyped transformations", in contrast to the "typed transformations" that come with strongly typed Scala/Java Datasets.
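
To make the alias concrete, here is a minimal sketch; the data, the column names, and the SparkSession variable spark are illustrative:

import org.apache.spark.sql.{DataFrame, Row, SparkSession}

val spark = SparkSession.builder().appName("demo").master("local[*]").getOrCreate()
import spark.implicits._

// Spark itself declares:  type DataFrame = Dataset[Row]
val df: DataFrame = Seq((1L, "alice"), (2L, "bob")).toDF("id", "name")  // untyped: elements are generic Rows
val firstRow: Row = df.head()                                           // you get a Row back, not a case class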

The conversion from Dataset[Row] to Dataset[Person] is very simple in Spark:

// Assuming a Spark 2.x SparkSession named `spark` (with the older API this would be a SQLContext instance)
val DFtoProcess = spark.sql("SELECT * FROM peoples WHERE name='test'")

At this point, Spark converts your data into DataFrame = Dataset[Row], a collection of generic Row objects, since it does not know the exact type.

import org.apache.spark.sql.Encoders

// Create an Encoder for a Java bean class (in this example, Person is a Java class).
// For a Scala case class you can rely on import spark.implicits._ instead (see the sketch below).
val personEncoder = Encoders.bean(classOf[Person])

val DStoProcess = DFtoProcess.as[Person](personEncoder)

Now Spark converts the Dataset[Row] into a Dataset[Person] of type-specific Scala/Java JVM objects, as dictated by the class Person.
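
If Person is a Scala case class rather than a Java bean, no explicit Encoders.bean call is needed: importing spark.implicits._ supplies the encoder. A minimal sketch, assuming a case class whose field names match the query's columns and a SparkSession named spark (the fields name and age, dsPeople, and adults are illustrative):

import org.apache.spark.sql.Dataset

case class Person(name: String, age: Long)

import spark.implicits._                                   // provides Encoder[Person] for case classes

val dsPeople: Dataset[Person] = DFtoProcess.as[Person]     // Dataset[Row] -> Dataset[Person]
val adults: Array[Person] = dsPeople.filter(_.age >= 18).collect()

// A single Row can also be converted by hand via getAs (field names follow your schema):
// val p = Person(someRow.getAs[String]("name"), someRow.getAs[Long]("age"))

Note that .as[Person] resolves columns by name, so the case class field names must match the column names returned by the query.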

Please refer to the link below, provided by Databricks, for further details:

https://databricks.com/blog/2016/07/14/a-tale-of-three-apache-spark-apis-rdds-dataframes-and-datasets.html
