Scala Spark - empty map on DataFrame column for map(String, Int)
Problem description
I am joining two DataFrames, where there are columns of type Map[String, Int]. I want the merged DF to have an empty map [] and not null on the Map type columns.
val df = dfmerged.select(
  col("id"),
  coalesce(col("map_1"), lit(null).cast(MapType(StringType, IntegerType))).alias("map_1"),
  coalesce(col("map_2"), lit(Map.empty[String, Int])).alias("map_2")
)
For the map_1 column, a null will be inserted, but I'd like to have an empty map. map_2 is giving me an error:
java.lang.RuntimeException: Unsupported literal type class scala.collection.immutable.Map$EmptyMap$ Map()
I've also tried with a udf function like:
case class myStructMap(x:Map[String, Int])
val emptyMap = udf(() => myStructMap(Map.empty[String, Int]))
It didn't work either. When I try something like:
.select( coalesce(col("myMapCol"), lit(map())).alias("brand_viewed_count")...
or
.select(coalesce(col("myMapCol"), lit(map().cast(MapType(LongType, LongType)))).alias("brand_viewed_count")...
I get the error:

cannot resolve 'map()' due to data type mismatch: cannot cast MapType(NullType,NullType,false) to MapType(LongType,IntType,true);
Recommended answer
In Spark 2.2:
import org.apache.spark.sql.functions.typedLit
val df = Seq((1L, null), (2L, Map("foobar" -> 42))).toDF("id", "map")
df.withColumn("map", coalesce($"map", typedLit(Map[String, Int]()))).show
// +---+-----------------+
// | id| map|
// +---+-----------------+
// | 1| Map()|
// | 2|Map(foobar -> 42)|
// +---+-----------------+
Before:
df.withColumn("map", coalesce($"map", map().cast("map<string,int>"))).show
// +---+-----------------+
// | id| map|
// +---+-----------------+
// | 1| Map()|
// | 2|Map(foobar -> 42)|
// +---+-----------------+
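The udf route from the question can also be made to work on older versions if the udf returns the Map directly instead of wrapping it in a case class. A minimal sketch, assuming a local Spark session and the same two-row example data as above (the object name EmptyMapFallback and the udf name emptyIntMap are illustrative, not from the original post):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{coalesce, col, udf}

object EmptyMapFallback {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("empty-map-fallback")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1L, null), (2L, Map("foobar" -> 42))).toDF("id", "map")

    // Returning the Map itself (not a case class wrapper) gives Spark a
    // return type it can encode, so the result is a map<string,int> column
    // that coalesce can combine with the original one.
    val emptyIntMap = udf(() => Map.empty[String, Int])

    df.withColumn("map", coalesce(col("map"), emptyIntMap())).show(false)

    spark.stop()
  }
}
```

The case class wrapper in the question fails because it makes the udf return a struct containing a map rather than a map, so its type no longer matches the column being coalesced.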
This concludes the article on Scala Spark - empty map on DataFrame column for map(String, Int). We hope the recommended answer helps, and thank you for supporting IT屋!