Why does from_json fail with "not found : value from_json"?
Question
I am reading a Kafka topic using Spark 2.1.1 (Kafka 0.10+), and the payload is a JSON string. I'd like to parse the string with a schema and move on to the business logic.
Everyone seems to suggest that I should use from_json to parse the JSON strings; however, it doesn't seem to compile in my case. The error is
not found : value from_json
.select(from_json($"json", txnSchema) as "data")
When I try the following lines in the Spark shell, they work just fine:
val df = stream
  .select($"value" cast "string" as "json")
  .select(from_json($"json", txnSchema) as "data")
  .select("data.*")
Any idea what I could be doing wrong in the code for it to work in the shell but not in the IDE/at compile time?
Here is the code:
import org.apache.spark.sql._

object Kafka10Cons3 extends App {
  val spark = SparkSession
    .builder
    .appName(Util.getProperty("AppName"))
    .master(Util.getProperty("spark.master"))
    .getOrCreate

  import spark.implicits._ // needed for the $"..." column syntax

  val stream = spark
    .readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", Util.getProperty("kafka10.broker"))
    .option("subscribe", src_topic)
    .load

  val txnSchema = Util.getTxnStructure
  val df = stream
    .select($"value" cast "string" as "json")
    .select(from_json($"json", txnSchema) as "data")
    .select("data.*")
}
Answer
You're probably just missing the relevant import: import org.apache.spark.sql.functions._.
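With that import added, a minimal sketch of the parsing step compiles (assuming stream and txnSchema are defined as in the question):

```scala
import org.apache.spark.sql.functions._ // brings from_json (and the other SQL functions) into scope

val df = stream
  .select($"value" cast "string" as "json")
  .select(from_json($"json", txnSchema) as "data") // now resolves to Spark's from_json
  .select("data.*")
```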
You have imported spark.implicits._ and org.apache.spark.sql._, but neither of these imports the individual functions inside the functions object.
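To see why, note that org.apache.spark.sql._ brings the functions object itself into scope, not its members, so without the functions._ import you would have to call it fully qualified (a sketch, reusing stream and txnSchema from the question):

```scala
import org.apache.spark.sql._

// functions is an object in org.apache.spark.sql, so the package wildcard
// imports the object but not the methods defined on it
val df = stream
  .select($"value" cast "string" as "json")
  .select(functions.from_json($"json", txnSchema) as "data")
  .select("data.*")
```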
I was also importing com.wizzardo.tools.json, which looks like it also has a from_json function, which must have been the one the compiler chose (since it was imported first?) and which was apparently incompatible with my version of Spark.
Make sure you are not importing a from_json function from some other JSON library, as that library may be incompatible with the version of Spark you are using.