Spark UDF for StructType / Row


Question

I have a "StructType" column in spark Dataframe that has an array and a string as sub-fields. I'd like to modify the array and return the new column of the same type. Can I process it with UDF? Or what are the alternatives?

import org.apache.spark.sql.types._
import org.apache.spark.sql.Row

// Struct column with an array sub-field and a string sub-field
val sub_schema = StructType(StructField("col1", ArrayType(IntegerType, false), true) :: StructField("col2", StringType, true) :: Nil)
val schema = StructType(StructField("subtable", sub_schema, true) :: Nil)
val data = Seq(Row(Row(Array(1, 2), "eb")), Row(Row(Array(3, 2, 1), "dsf")))
val rd = sc.parallelize(data)
val df = spark.createDataFrame(rd, schema)
df.printSchema

root
 |-- subtable: struct (nullable = true)
 |    |-- col1: array (nullable = true)
 |    |    |-- element: integer (containsNull = false)
 |    |-- col2: string (nullable = true)

It seems that I need a UDF of the type Row, something like

val u = udf((x: Row) => x)
       >> Schema for type org.apache.spark.sql.Row is not supported

This makes sense, since Spark does not know the schema for the return type. Unfortunately, udf.register fails too:

spark.udf.register("foo", (x:Row)=> Row, sub_schema)
     <console>:30: error: overloaded method value register with alternatives: ...

Answer

It turns out you can pass the result schema as the second UDF parameter:

val u = udf((x: Row) => x, sub_schema)
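
For example, a minimal sketch of the original use case (modify the array sub-field and return a struct of the same shape). The name bumpCol1 and the "+1" transformation are illustrative, not from the original post; the two-argument udf(f, dataType) overload used here is available in Spark 2.x (it is deprecated in Spark 3):

import org.apache.spark.sql.functions.{col, udf}

// Hypothetical transformation: increment each element of col1, keep col2 as-is.
// The UDF receives the struct as a Row and must return a Row matching sub_schema,
// which is why the schema is passed as the second argument.
val bumpCol1 = udf((r: Row) => {
  val updated = r.getSeq[Int](0).map(_ + 1)  // col1 arrives as a Seq[Int]
  Row(updated, r.getString(1))               // rebuild the struct in the same shape
}, sub_schema)

val transformed = df.withColumn("subtable", bumpCol1(col("subtable")))
transformed.printSchema  // same schema as the original df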
