Scala Spark: splitting dataframe column dynamically

Problem description

I am very new to Scala and Spark.

I have read a text file into a dataframe and successfully split the single column into multiple columns (essentially the file is a SPACE-delimited CSV):

  import org.apache.spark.sql.DataFrame
  import org.apache.spark.sql.functions.split
  import spark.implicits._   // for the $"col" syntax

  // The file is space delimited, so spark.read.csv loads it as a single column (_c0)
  val irisDF: DataFrame = spark.read.csv("src/test/resources/iris-in.txt")

  irisDF.show()

  // Split _c0 on spaces and pull the array elements out into named columns
  val dfnew: DataFrame = irisDF.withColumn("_tmp", split($"_c0", " ")).select(
    $"_tmp".getItem(0).as("col1"),
    $"_tmp".getItem(1).as("col2"),
    $"_tmp".getItem(2).as("col3"),
    $"_tmp".getItem(3).as("col4")
  ).drop("_tmp")

This works.

BUT what if I do not know how many columns there are in the data file? How do I dynamically generate the columns depending on the number of items produced by the split function?

Recommended answer

You can create a sequence of select expressions and then apply all of them to the select method with the :_* syntax:

Example data:

val df = Seq("a b c d", "e f g").toDF("c0")

df.show
+-------+
|     c0|
+-------+
|a b c d|
|  e f g|
+-------+

Suppose you want five columns from the c0 column (the number of columns is something you need to determine before doing this):

val selectExprs = 0 until 5 map (i => $"temp".getItem(i).as(s"col$i"))

df.withColumn("temp", split($"c0", " ")).select(selectExprs:_*).show
+----+----+----+----+----+
|col0|col1|col2|col3|col4|
+----+----+----+----+----+
|   a|   b|   c|   d|null|
|   e|   f|   g|null|null|
+----+----+----+----+----+
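
If the number of columns is not known up front, one option is to compute it from the data itself before building the select expressions, for example by taking the maximum token count produced by the split. The snippet below is a minimal sketch along those lines, not part of the original answer; it assumes the same spark session, import spark.implicits._, and df as above:

import org.apache.spark.sql.functions.{max, size, split}

// Split once and keep the array column around
val withTokens = df.withColumn("temp", split($"c0", " "))

// The column count is the length of the longest token array in the data (this triggers a Spark job)
val numCols = withTokens.select(max(size($"temp"))).first.getInt(0)

// Same pattern as the answer above, but driven by the computed count
val selectExprs = 0 until numCols map (i => $"temp".getItem(i).as(s"col$i"))

withTokens.select(selectExprs: _*).show

With the example data above this yields four columns (col0 through col3), since the longest row has four tokens.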
