Kafka JDBC sink connector with JSON schema not working
Question
Using the latest Kafka and Confluent JDBC sink connectors, I'm sending a really simple JSON message:
{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "int",
        "optional": false,
        "field": "id"
      },
      {
        "type": "string",
        "optional": true,
        "field": "msg"
      }
    ],
    "optional": false,
    "name": "msgschema"
  },
  "payload": {
    "id": 222,
    "msg": "hi"
  }
}
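For reference, the schema-plus-payload envelope can be built programmatically before producing it. A minimal sketch (the producer and topic details are omitted; note also that Connect's JsonConverter expects sized integer types such as int32 rather than a bare "int"):

```python
import json

# Build the Connect JSON envelope: a "schema" describing the record
# and a "payload" carrying the actual field values.
envelope = {
    "schema": {
        "type": "struct",
        "fields": [
            # JsonConverter uses sized integer types (int8/int16/int32/int64),
            # so "int32" here rather than the bare "int" from the question.
            {"type": "int32", "optional": False, "field": "id"},
            {"type": "string", "optional": True, "field": "msg"},
        ],
        "optional": False,
        "name": "msgschema",
    },
    "payload": {"id": 222, "msg": "hi"},
}

# Serialize to the JSON string that would be sent as the message value.
message = json.dumps(envelope)
print(message)
```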
But I get the error:
org.apache.kafka.connect.errors.DataException: JsonConverter with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.
Jsonlint says the JSON is valid. I have kept json schemas.enable=true in the Kafka configuration. Any pointers?
Answer
You need to tell Connect that your schema is embedded in the JSON you're using.
You have:
value.converter=org.apache.kafka.connect.json.JsonConverter
but you also need:
value.converter.schemas.enable=true
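Put together, the relevant converter settings look like this. This is a sketch of the worker (or connector-level) properties; the key-converter lines are an assumption about a typical setup where keys are plain JSON without a schema:

```properties
# Value converter: JsonConverter with schema envelopes enabled,
# so Connect reads the "schema"/"payload" structure shown above.
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true

# Assumed key setup: plain JSON keys without an embedded schema.
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
```

With schemas.enable=true, the converter requires exactly the "schema" and "payload" top-level fields; with it set to false, it expects plain JSON, which is why the mismatch produces the error above.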