Spark Select with a List of Columns (Scala)
Question
I am trying to find a good way of doing a Spark select with a List[Column]. I am exploding a column, then passing back all the columns I am interested in along with my exploded column.
var columns = getColumns(x) // Returns a List[Column]
tempDf.select(columns) // does not compile: select expects varargs (Column*), not a List[Column]
I know that, if it were a list of strings, I could do something like:
val result = dataframe.select(columnNames.head, columnNames.tail: _*)
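The `head`/`tail` idiom works because `select(col: String, cols: String*)` is a varargs method, and `: _*` splats the remaining list into the varargs parameter. A minimal sketch of the same mechanic, using a toy `select` instead of Spark's so it runs without a Spark dependency:

```scala
// Toy stand-in for Dataset.select(col: String, cols: String*),
// illustrating the head/tail varargs idiom without Spark.
object VarargsDemo {
  def select(col: String, cols: String*): Seq[String] = col +: cols

  def main(args: Array[String]): Unit = {
    val columnNames = List("id", "name", "score")
    // ": _*" splats the tail of the list into the String* parameter
    val picked = select(columnNames.head, columnNames.tail: _*)
    println(picked.mkString(","))
  }
}
```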
Answer
For Spark 2.0 you have two options. Both depend on how you manage your columns (as Strings or as Columns).
Spark source code (spark-sql_2.11/org/apache/spark/sql/Dataset.scala):
def select(cols: Column*): DataFrame = withPlan {
Project(cols.map(_.named), logicalPlan)
}
def select(col: String, cols: String*): DataFrame = select((col +: cols).map(Column(_)) : _*)
You can see how, internally, Spark converts your `head` & `tail` into a list of `Column`s and then calls `select` again.
So, in that case, if you want clearer code, I would recommend:
If `columns: List[String]`:
import org.apache.spark.sql.functions.col
df.select(columns.map(col): _*)
Otherwise, if `columns: List[Column]`:
df.select(columns: _*)
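To see which splat form matches which overload, here is a sketch using toy versions of the two `select` signatures shown above (a stand-in `Column` case class, no Spark dependency), mirroring the `List[String]` and `List[Column]` cases:

```scala
// Toy model of the two Dataset.select overloads, to show how each
// list type is splatted into the matching varargs parameter.
object SelectOverloads {
  case class Column(name: String)

  // Mirrors select(cols: Column*)
  def select(cols: Column*): Seq[String] = cols.map(_.name)

  // Mirrors select(col: String, cols: String*)
  def select(col: String, cols: String*): Seq[String] = col +: cols

  def main(args: Array[String]): Unit = {
    val asStrings: List[String] = List("a", "b", "c")
    val asColumns: List[Column] = asStrings.map(Column.apply)

    // List[String]: map each name to a Column, then splat into Column*
    println(select(asStrings.map(Column.apply): _*).mkString(","))
    // List[Column]: splat directly into Column*
    println(select(asColumns: _*).mkString(","))
  }
}
```

Both calls resolve to the `Column*` overload; the `String` overload is only needed for the `head`/`tail` trick from the question.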