How to use from_json with Kafka connect 0.10 and Spark Structured Streaming?
Problem Description
I was trying to reproduce the example from [Databricks][1] and apply it to the new Kafka connector and Spark Structured Streaming; however, I cannot parse the JSON correctly using the out-of-the-box methods in Spark...
Note: the topic is written to Kafka in JSON format.
val ds1 = spark
.readStream
.format("kafka")
.option("kafka.bootstrap.servers", IP + ":9092")
.option("zookeeper.connect", IP + ":2181")
.option("subscribe", TOPIC)
.option("startingOffsets", "earliest")
.option("max.poll.records", 10)
.option("failOnDataLoss", false)
.load()
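For reference, the Kafka source exposes the key and value columns as binary, which is why value is cast to a string below; a minimal check against ds1:

// The Kafka source yields key and value as binary columns;
// printSchema confirms why value must be cast before JSON parsing.
ds1.printSchema()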
The following code won't work; I believe that's because the json column is a string and does not match the from_json method signature...
val df = ds1.select($"value" cast "string" as "json")
.select(from_json("json") as "data")
.select("data.*")
Any suggestions?
Recommended Answer
First, you need to define the schema for your JSON message. For example:
val schema = new StructType()
.add($"id".string)
.add($"name".string)
Now you can use this schema in the from_json method as shown below:
val df = ds1.select($"value" cast "string" as "json")
.select(from_json($"json", schema) as "data")
.select("data.*")