kafka jdbc sink connector standalone error


Problem description


I am trying to insert data into a Postgres database from a topic in Kafka. I am using the following command to load the connector:

./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/sink-quickstart-mysql.properties


The sink-quickstart-mysql.properties file is as follows:

name=test-sink-mysql-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=third_topic
connection.url=jdbc:postgresql://localhost:5432/postgres
connection.user=postgres
connection.password=postgres
auto.create=true

The error I get is:

[2019-01-29 13:16:48,859] ERROR Failed to create job for /home/ashley/confluent-5.1.0/etc/kafka-connect-jdbc/sink-quickstart-mysql.properties (org.apache.kafka.connect.cli.ConnectStandalone:102)
[2019-01-29 13:16:48,862] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:179)
    at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
    at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
    at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
[2019-01-29 13:16:48,886] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2019-01-29 13:16:48,886] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)
[2019-01-29 13:16:48,894] INFO Stopped http_8083@dc4fee1{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:341)
[2019-01-29 13:16:48,895] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2019-01-29 13:16:48,930] INFO Stopped o.e.j.s.ServletContextHandler@3c46dcbe{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:1040)
[2019-01-29 13:16:48,943] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:241)
[2019-01-29 13:16:48,943] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)
[2019-01-29 13:16:48,944] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:184)
[2019-01-29 13:16:48,944] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2019-01-29 13:16:48,947] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:205)
[2019-01-29 13:16:48,950] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:112)
[2019-01-29 13:16:48,951] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)


The Postgres jar file is already in the folder. Can someone advise?

Answer

These lines are the most important ones in your log:


java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are:...

It seems that you didn't install the kafka-connect-jdbc connector.

Check the plugin.path property in etc/schema-registry/connect-avro-standalone.properties and ensure that the plugin.path line is uncommented.
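As a quick sanity check, you can grep the worker config for an uncommented plugin.path line. The sketch below runs against a throwaway copy of the file (the real path from the question is etc/schema-registry/connect-avro-standalone.properties, and share/java is just an illustrative value):

```shell
# Sketch: verify that plugin.path is set (uncommented) in the worker config.
# A temp file stands in for etc/schema-registry/connect-avro-standalone.properties.
CONFIG=$(mktemp)
printf '%s\n' '#plugin.path=share/java' > "$CONFIG"   # commented out

if grep -q '^plugin.path=' "$CONFIG"; then
  echo "plugin.path is set"
else
  echo "plugin.path is commented out -- uncomment it"   # → printed for the file above
fi
rm -f "$CONFIG"
```

Run the same grep against your real properties file; if nothing matches, the worker falls back to only the built-in connectors, which is exactly the list the error message shows.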

If you are not using Confluent Platform, you will need to create another directory for the JDBC plugin under that plugin.path directory, e.g. kafka-connect-jdbc, and put all the needed jars there: kafka-connect-jdbc-5.1.0.jar, its dependencies, and your JDBC driver.
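A sketch of that layout, using a throwaway directory in place of the real plugin.path (the driver jar name and versions here are illustrative, not prescribed by the answer):

```shell
# One sub-directory per plugin under plugin.path, holding the connector jar,
# its dependencies, and the JDBC driver. Empty placeholder files stand in for jars.
PLUGIN_PATH=$(mktemp -d)               # stands in for e.g. /opt/connectors
mkdir -p "$PLUGIN_PATH/kafka-connect-jdbc"
touch "$PLUGIN_PATH/kafka-connect-jdbc/kafka-connect-jdbc-5.1.0.jar"   # the connector itself
touch "$PLUGIN_PATH/kafka-connect-jdbc/postgresql-42.2.5.jar"          # your JDBC driver (version illustrative)
ls "$PLUGIN_PATH/kafka-connect-jdbc"
rm -rf "$PLUGIN_PATH"
```

With the real directory in place, point plugin.path in the worker properties at its parent directory and restart the standalone worker.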

More details can be found at: https://docs.confluent.io/current/connect/userguide.html#installing-plugins
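Once the worker restarts successfully, one way to confirm the plugin was picked up is the standard Kafka Connect REST interface (port 8083 is the default from the log above; assuming the worker runs locally):

```shell
# List the plugins the worker discovered; io.confluent.connect.jdbc.JdbcSinkConnector
# should appear in the output once plugin.path is correct.
curl -s http://localhost:8083/connector-plugins || echo "worker not reachable"
```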

