How to populate the select clause of a DataFrame dynamically? Getting AnalysisException
Problem description
I am using spark-sql 2.4.1 and Java 8.
val country_df = Seq(
("us",2001),
("fr",2002),
("jp",2002),
("in",2001),
("fr",2003),
("jp",2002),
("in",2003)
).toDF("country","data_yr")
val col_df = country_df.select("country").where($"data_yr" === 2001)
val data_df = Seq(
("us_state_1","fr_state_1" ,"in_state_1","jp_state_1"),
("us_state_2","fr_state_2" ,"in_state_2","jp_state_1"),
("us_state_3","fr_state_3" ,"in_state_3","jp_state_1")
).toDF("us","fr","in","jp")
data_df.select("us","in").show()
How can I populate this select clause (of data_df) dynamically, from country_df, for a given year?
That is, from the first DataFrame I will get the values of a column, and those values are the columns I need to select from the second DataFrame. How can this be done?
Tried:
List<String> aa = col_df.select(functions.lower(col("data_item_code"))).map(row -> row.mkString(" ",", "," "), Encoders.STRING()).collectAsList();
data_df.select(aa.stream().map(s -> new Column(s)).toArray(Column[]::new));
Error:
.AnalysisException: cannot resolve '` un `' given input columns: [abc,.....all columns ...]
So what is wrong here, and how can it be fixed?
Recommended answer
You can try the code below.
Select the column names from the first dataset.
List<String> columns = country_df.select("country").where($"data_yr" === 2001).as(Encoders.STRING()).collectAsList();
Use the column names with selectExpr on the second dataset.
import scala.collection.JavaConverters;
import scala.collection.Seq;

public static Seq<String> convertListToSeq(List<String> inputList) {
    return JavaConverters.asScalaIteratorConverter(inputList.iterator()).asScala().toSeq();
}
//using selectExpr
data_df.selectExpr(convertListToSeq(columns)).show(true);
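As for why the original attempt failed: most likely the culprit is `row.mkString(" ",", "," ")`, which wraps each collected value in spaces, so Spark is asked to resolve a column literally named " in " rather than "in" (hence the quoted padding visible in the AnalysisException message). A Spark-free sketch of that padding, and of the trim that would repair the original `Column[]` approach:

```java
public class MkStringPadding {
    // what row.mkString(" ", ", ", " ") returns for a single-column row:
    // the start and end separators become leading/trailing spaces
    static String padLikeMkString(String name) {
        return " " + name + " ";
    }

    public static void main(String[] args) {
        String collected = padLikeMkString("in");
        // Spark would try to resolve a column literally named " in "
        System.out.println("'" + collected + "'");        // ' in '
        // trimming before building Column objects restores the real name
        System.out.println("'" + collected.trim() + "'"); // 'in'
    }
}
```

The selectExpr route above sidesteps the problem entirely, because `as(Encoders.STRING()).collectAsList()` yields the raw column values with no added padding.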