Why does from_json fail with "not found : value from_json"?
Problem description
I am reading a Kafka topic using Spark 2.1.1 (Kafka 0.10+) and the payload is a JSON string. I'd like to parse the string against a schema and move forward with the business logic.
Everyone seems to suggest that I should use from_json to parse the JSON strings; however, it doesn't compile in my case. The error is
not found : value from_json
.select(from_json($"json", txnSchema) as "data")
When I try the following lines in the spark shell, it works just fine -
val df = stream
.select($"value" cast "string" as "json")
.select(from_json($"json", txnSchema) as "data")
.select("data.*")
Any idea what I could be doing wrong for this code to work in the shell but not in the IDE / at compile time?

The code is as follows:
import org.apache.spark.sql._
object Kafka10Cons3 extends App {
val spark = SparkSession
.builder
.appName(Util.getProperty("AppName"))
.master(Util.getProperty("spark.master"))
.getOrCreate
import spark.implicits._ // required for the $"..." column syntax
val stream = spark
.readStream
.format("kafka")
.option("kafka.bootstrap.servers", Util.getProperty("kafka10.broker"))
.option("subscribe", src_topic)
.load
val txnSchema = Util.getTxnStructure
val df = stream
.select($"value" cast "string" as "json")
.select(from_json($"json", txnSchema) as "data")
.select("data.*")
}
Recommended answer

You're probably just missing the relevant import: import org.apache.spark.sql.functions._.
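In other words, the fix is a single import at the top of the file. A minimal sketch of the corrected header, assuming the rest of the question's code stays unchanged:

```scala
import org.apache.spark.sql._
// from_json lives in the functions object, not in org.apache.spark.sql._ itself:
import org.apache.spark.sql.functions._

object Kafka10Cons3 extends App {
  // ... SparkSession setup and stream definition exactly as in the question ...
  // from_json($"json", txnSchema) now resolves to Spark's function.
}
```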
You have imported spark.implicits._ and org.apache.spark.sql._, but neither of these imports the individual functions in functions.
I was also importing com.wizzardo.tools.json, which looks like it also has a from_json function, which must have been the one the compiler chose (since it was imported first?) and which was apparently incompatible with my version of Spark.
Make sure you are not importing the from_json function from some other JSON library, as that library may be incompatible with the version of Spark you are using.