Spark - convert Map to a single-row DataFrame


Problem description

In my application I have a need to create a single-row DataFrame from a Map.

A map like

("col1" -> 5, "col2" -> 10, "col3" -> 6)

would be transformed into a DataFrame with a single row and the map keys would become names of columns.

col1 | col2 | col3
5    | 10   | 6

In case you are wondering why I would want this - I just need to save a single document with some statistics into MongoDB using the MongoSpark connector, which allows saving DFs and RDDs.
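
For context, here is a minimal sketch of how such a one-row DataFrame could then be written as a single document, assuming the MongoDB Spark Connector 2.x API (MongoSpark.save) and an output URI already configured on the SparkSession; statsDf is a placeholder name for the DataFrame built in the answer below:

import com.mongodb.spark.MongoSpark

// Assumes spark.mongodb.output.uri points at the target database/collection.
// statsDf is the one-row DataFrame holding the statistics (hypothetical name).
MongoSpark.save(statsDf)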

Recommended answer

Here you go:

import org.apache.spark.sql.functions.lit
import spark.implicits._ // spark is the active SparkSession; needed for Seq(...).toDF

val map: Map[String, Int] = Map("col1" -> 5, "col2" -> 6, "col3" -> 10)

// Seed a one-row DataFrame from the first map entry, then fold over the rest,
// adding each remaining entry as a literal column.
val df = map.tail
  .foldLeft(Seq(map.head._2).toDF(map.head._1))((acc, curr) => acc.withColumn(curr._1, lit(curr._2)))


df.show()

+----+----+----+
|col1|col2|col3|
+----+----+----+
|   5|   6|  10|
+----+----+----+
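
An alternative that avoids chaining withColumn calls is to build an explicit schema and a single Row directly from the map; a minimal sketch, assuming an existing SparkSession named spark and integer-only values:

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

// One StructField per map key, one Row from the values; keys and values of
// the same Map iterate in the same order, so columns and values line up.
val schema = StructType(map.keys.toSeq.map(k => StructField(k, IntegerType, nullable = false)))
val singleRow = Row.fromSeq(map.values.toSeq)
val df2 = spark.createDataFrame(spark.sparkContext.parallelize(Seq(singleRow)), schema)

df2.show() // same one-row result as above

This builds the DataFrame in one step instead of one withColumn projection per key, which matters little for a single row but keeps the query plan flat if the map grows.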

