How to use camel-avro-consumer & producer?


Question

I don't see an example of how to use the camel-avro component to produce and consume Kafka Avro messages. Currently my Camel route is the one below. What should be changed so that it works with a schema registry and other properties like the following, using a camel-kafka Avro consumer & producer?

// Confluent Avro settings I would use with the plain Kafka consumer API
props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

public void configure() {
        PropertiesComponent pc = getContext().getComponent("properties", PropertiesComponent.class); 
        pc.setLocation("classpath:application.properties");

        log.info("About to start route: Kafka Server -> Log ");

        from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
                + "&maxPollRecords={{consumer.maxPollRecords}}"
                + "&consumersCount={{consumer.consumersCount}}"
                + "&seekTo={{consumer.seekTo}}"
                + "&groupId={{consumer.group}}"
                +"&valueDeserializer="+KafkaAvroDeserializer.class
                +"&keyDeserializer="+StringDeserializer.class
                )
            .routeId("FromKafka")
            .log("${body}");
}

Answer

I'm answering my own question because I sat on this problem for a couple of days. I hope this answer will be helpful to others.

I tried to use the io.confluent.kafka.serializers.KafkaAvroDeserializer deserializer and got a Kafka exception, so I had to write my own deserializer to do the following:

  1. Set the schema registry
  2. Use the specific Avro reader (which means not the default StringDeserializer)

Then we must access the "schemaRegistry" and "useSpecificAvroReader" fields of AbstractKafkaAvroDeserializer (io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer) and set them.

Here is the solution...

public static void main(String[] args) throws Exception {
    LOG.info("About to run Kafka-camel integration...");
    CamelContext camelContext = new DefaultCamelContext();
    // Add a route that consumes messages from Kafka and logs them
    camelContext.addRoutes(new RouteBuilder() {
        public void configure() throws Exception {                
            PropertiesComponent pc = getContext().getComponent("properties", 
                                      PropertiesComponent.class);
            pc.setLocation("classpath:application.properties");

            log.info("About to start route: Kafka Server -> Log ");

            from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
                    + "&maxPollRecords={{consumer.maxPollRecords}}"
                    + "&consumersCount={{consumer.consumersCount}}"
                    + "&seekTo={{consumer.seekTo}}"
                    + "&groupId={{consumer.group}}"
                    + "&keyDeserializer="+ StringDeserializer.class.getName() 
                    + "&valueDeserializer="+CustomKafkaAvroDeserializer.class.getName()
                    )
                    .routeId("FromKafka")
                .log("${body}");

        }
    });
    camelContext.start();
    // let it run for 5 minutes before shutting down
    Thread.sleep(5 * 60 * 1000);
    camelContext.stop();
}

DESERIALIZER CLASS - sets schema.registry.url & use.specific.avro.reader at the AbstractKafkaAvroDeserializer level. Without setting these I would get a kafka-config-exception.

package com.example.camel.kafka.avro;

import java.util.Collections;
import java.util.List;
import java.util.Map;

import io.confluent.common.config.ConfigException;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.common.serialization.Deserializer;

public class CustomKafkaAvroDeserializer extends AbstractKafkaAvroDeserializer
    implements Deserializer<Object> {

  private static final String SCHEMA_REGISTRY_URL = "http://localhost:8081";

  @Override
  public void configure(KafkaAvroDeserializerConfig config) {
    try {
      // Point the deserializer at the schema registry and enable the specific
      // Avro reader; neither can be passed through the Camel endpoint URI,
      // which is why they are hard-coded here.
      final List<String> schemas = Collections.singletonList(SCHEMA_REGISTRY_URL);
      this.schemaRegistry = new CachedSchemaRegistryClient(schemas, Integer.MAX_VALUE);
      this.useSpecificAvroReader = true;
    } catch (ConfigException e) {
      throw new org.apache.kafka.common.config.ConfigException(e.getMessage());
    }
  }

  @Override
  public void configure(Map<String, ?> configs, boolean isKey) {
    // The Kafka client calls this variant; delegate to the one above.
    configure(null);
  }

  @Override
  public Object deserialize(String s, byte[] bytes) {
    return deserialize(bytes);
  }

  @Override
  public void close() {
  }
}
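
The producer side, which the question also asks about, is not shown above. Below is a mirror-image sketch along the same lines, not part of the original answer: a custom serializer that extends AbstractKafkaAvroSerializer and hard-codes the registry client, plus a producer route that references it. It is an assumption based on the Confluent 3.x/4.x serializer API (serializeImpl, autoRegisterSchema) and on the Camel 2.x producer options keySerializerClass/serializerClass; the class name CustomKafkaAvroSerializer and the producer.topic property are made up for the example, and the option names may differ in other Camel versions.

package com.example.camel.kafka.avro;

import java.util.Collections;
import java.util.List;
import java.util.Map;

import io.confluent.common.config.ConfigException;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerializer;
import io.confluent.kafka.serializers.KafkaAvroSerializerConfig;
import org.apache.kafka.common.serialization.Serializer;

public class CustomKafkaAvroSerializer extends AbstractKafkaAvroSerializer
    implements Serializer<Object> {

  private static final String SCHEMA_REGISTRY_URL = "http://localhost:8081";

  @Override
  public void configure(KafkaAvroSerializerConfig config) {
    try {
      // Same trick as the deserializer: hard-code the registry client because
      // the URL cannot be passed through the Camel endpoint URI.
      final List<String> schemas = Collections.singletonList(SCHEMA_REGISTRY_URL);
      this.schemaRegistry = new CachedSchemaRegistryClient(schemas, Integer.MAX_VALUE);
      // Assumed protected field in this Confluent version line; lets the
      // serializer register the schema on first use.
      this.autoRegisterSchema = true;
    } catch (ConfigException e) {
      throw new org.apache.kafka.common.config.ConfigException(e.getMessage());
    }
  }

  @Override
  public void configure(Map<String, ?> configs, boolean isKey) {
    configure(null);
  }

  @Override
  public byte[] serialize(String topic, Object record) {
    // serializeImpl(subject, record) is the protected helper of
    // AbstractKafkaAvroSerializer in the 3.x/4.x line; the subject follows
    // the default "<topic>-value" naming convention.
    return serializeImpl(topic + "-value", record);
  }

  @Override
  public void close() {
  }
}

// Hypothetical producer route, wired the same way as the consumer route above:
from("direct:send")
    .to("kafka:{{producer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
            + "&keySerializerClass=" + StringSerializer.class.getName()
            + "&serializerClass=" + CustomKafkaAvroSerializer.class.getName());

As with the consumer, nothing registry-specific appears in the endpoint URI; it is all hidden inside the custom serializer class.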
