Spark Sql query fails


Problem Description


Using Spark 2 / Java / Cassandra 2.2. Trying to run a simple Spark SQL query, it errors out. I tried the query below, plus variations like "'LAX'" and '=' instead of '=='.

Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`LAX`' given input columns: [transdate, origin]; line 1 pos 42;
'Project ['origin]
+- 'Filter (origin#1 = 'LAX)
   +- SubqueryAlias origins
      +- LogicalRDD [transdate#0, origin#1]

JavaRDD<TransByDate> originDateRDD = javaFunctions(sc)
        .cassandraTable("trans", "trans_by_date", CassandraJavaUtil.mapRowTo(TransByDate.class))
        .select(CassandraJavaUtil.column("origin"),
                CassandraJavaUtil.column("trans_date").as("transdate"));

long cnt1 = originDateRDD.count();
System.out.println("sqlLike originDateRDD.count: " + cnt1); // prints 406000

Dataset<Row> originDF = sparks.createDataFrame(originDateRDD, TransByDate.class);
originDF.createOrReplaceTempView("origins");
Dataset<Row> originlike = sparks.sql("SELECT origin FROM origins WHERE origin =="+ "LAX");


I have enabled Hive support (if that helps). Thanks.

Recommended Answer


Put the column value inside single quotes. Without the quotes, Spark SQL parses LAX as a column identifier, which is why the analyzer reports "cannot resolve '`LAX`' given input columns". Your query should look like the one below.

Dataset<Row> originlike = spark.sql("SELECT origin FROM origins WHERE origin == "+"'LAX'");
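As a side note (not from the original answer), the same filter can be written with the DataFrame API, which avoids quoting mistakes in concatenated SQL strings. A minimal sketch, assuming the originDF Dataset from the question:

// Sketch only: equivalent filter via the DataFrame API, assuming originDF from the question.
// Requires: import static org.apache.spark.sql.functions.col;
Dataset<Row> originEq = originDF.filter(col("origin").equalTo("LAX")).select("origin");
originEq.show();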


You can refer to Querying Cassandra data using Spark SQL in Java for more details.


A LIKE query should look like the one below.

Dataset<Row> originlike = spark.sql("SELECT origin FROM origins WHERE origin like 'LA%'");
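Equivalently (again a sketch, not from the original answer), the prefix match can be expressed with the DataFrame API:

// Sketch only: LIKE 'LA%' expressed via the DataFrame API, assuming originDF from the question.
// Requires: import static org.apache.spark.sql.functions.col;
Dataset<Row> originPrefix = originDF.filter(col("origin").like("LA%")).select("origin");
originPrefix.show();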
