available: expected at least 1 bean which qualifies as autowire candidate

Problem Description

I want to implement a Kafka producer which sends and receives Java serialized objects. I tried this:

Producer:

@Configuration
public class KafkaProducerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Send object:

@Autowired
private KafkaTemplate<String, SaleRequestFactory> kafkaTemplate;

private static String topic = "tp-sale";

private void perform(){

    Transaction transaction = new Transaction();
    transaction.setStatus(PaymentTransactionStatus.IN_PROGRESS.getText());

    Transaction insertedTransaction = transactionService.save(transaction);

    SaleRequestFactory obj = new SaleRequestFactory();
    obj.setId(100);

    ListenableFuture<SendResult<String, SaleRequestFactory>> send = kafkaTemplate.send(topic, obj);
}

Consumer:

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    private String groupId = "test";

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, SaleRequestFactory.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String>
    kafkaListenerContainerFactory() {

        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

// Receive object

    private static String topic = "tp-sale";

    @KafkaListener(topics = "tp-sale")
    public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf, @Headers MessageHeaders headers) throws Exception {

        System.out.println(tf.getId());

        SaleResponseFactory resObj = new SaleResponseFactory();
        resObj.setUnique_id("123123");

        return resObj;
    }

When I deploy the Producer I get an error during deployment:

Application failed to start due to org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.kafka.core.KafkaTemplate&lt;java.lang.String, org.engine.plugin.transactions.factory.SaleRequestFactory&gt;' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}

Do you know how I can fix this issue?

I managed to implement these improvements:

Producer:

@Configuration
public class KafkaProducerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, SaleRequestFactory> saleRequestFactoryProducerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, SaleRequestFactorySerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, SaleRequestFactory> saleRequestFactoryKafkaTemplate() {
        return new KafkaTemplate<>(saleRequestFactoryProducerFactory());
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Send object:

@Autowired
private KafkaTemplate<String, SaleRequestFactory> saleRequestFactoryKafkaTemplate;

private static String topic = "tp-sale";

private void perform(){

    SaleRequestFactory obj = new SaleRequestFactory();
    obj.setId(100);

    ListenableFuture<SendResult<String, SaleRequestFactory>> send = saleRequestFactoryKafkaTemplate.send(topic, obj);
}

Consumer:

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    private String groupId = "test";

    @Bean
    public ConsumerFactory<String, SaleResponseFactory> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, SaleResponseFactoryDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, SaleResponseFactory> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, SaleResponseFactory> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

// Receive object

    @KafkaListener(topics = "tp-sale")
    public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf, @Headers MessageHeaders headers) throws Exception {

        System.out.println(tf.getId());

        SaleResponseFactory resObj = new SaleResponseFactory();
        resObj.setUnique_id("123123");

        return resObj;
    }

Custom objects

    @Getter
    @Setter
    @NoArgsConstructor
    @AllArgsConstructor
    @Builder(toBuilder = true)
    public class SaleRequestFactory implements Serializable{
    
        private static final long serialVersionUID = 1744050117179344127L;
        
        private int id;
    }

public class SaleRequestFactorySerializer implements Serializable, Serializer<SaleRequestFactory> {

    @Override
    public byte[] serialize(String topic, SaleRequestFactory data) {
        // convert data to byte[]
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try
        {
            ObjectOutputStream outputStream = new ObjectOutputStream(out);
            outputStream.writeObject(data);
            out.close();
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }

        return out.toByteArray();
    }
}


    @Getter
    @Setter
    @NoArgsConstructor
    @AllArgsConstructor
    @Builder(toBuilder = true)
    public class SaleResponseFactory implements Serializable{
    
        private static final long serialVersionUID = 1744050117179344127L;
    
        private String unique_id;
    }

public class SaleResponseFactoryDeserializer implements Serializable, Deserializer<SaleResponseFactory> {

    @Override
    public SaleResponseFactory deserialize(String topic, byte[] data) {
        // convert data to SaleResponseFactory
        SaleResponseFactory saleResponseFactory = null;
        try
        {
            ByteArrayInputStream bis = new ByteArrayInputStream(data);
            ObjectInputStream in = new ObjectInputStream(bis);
            saleResponseFactory = (SaleResponseFactory) in.readObject();
            in.close();
        }
        catch (IOException | ClassNotFoundException e)
        {
            e.printStackTrace();
        }
        return saleResponseFactory;
    }
}

When I send a message I get this error:

13:03:53.675 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] DEBUG RecordMessagingMessageListenerAdapter[debug:296] - Listener method returned result [org.factory.SaleResponseFactory@69c400ab] - generating response message for it
13:03:53.675 [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] DEBUG RecordMessagingMessageListenerAdapter[debug:296] - No replyTopic to handle the reply: org.factory.SaleResponseFactory@69c400ab

Do you know how I can solve this issue?

Recommended Answer

Answering the update part: I believe the debug messages are misleading, because you are not opting in to forwarding the result to another topic. More details here. Everything appears to be working as expected. I think there needs to be a way to skip the entire forwarding step, including the logging, when the listener return type is not Message and there is no @SendTo annotation, which is required for forwarding. I've dropped a comment for the Spring Kafka committer to see what he thinks.
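
If forwarding is actually wanted, the listener container factory needs a reply template and the listener needs a destination. Below is a minimal sketch, not taken from the original post: the reply topic name tp-sale-response, the KafkaTemplate&lt;String, SaleResponseFactory&gt; (which would need a SaleResponseFactory-capable serializer), and the ConsumerFactory&lt;String, SaleRequestFactory&gt; are assumptions made for illustration.

// Sketch only: topic name, reply template and consumer factory types are assumed;
// they do not appear in this form in the post above.
@Bean
public ConcurrentKafkaListenerContainerFactory<String, SaleRequestFactory> kafkaListenerContainerFactory(
        ConsumerFactory<String, SaleRequestFactory> consumerFactory,
        KafkaTemplate<String, SaleResponseFactory> replyTemplate) {
    ConcurrentKafkaListenerContainerFactory<String, SaleRequestFactory> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    factory.setReplyTemplate(replyTemplate); // @SendTo needs a template to send the reply with
    return factory;
}

@KafkaListener(topics = "tp-sale")
@SendTo("tp-sale-response") // org.springframework.messaging.handler.annotation.SendTo
public SaleResponseFactory transactionElavonAuthorizeProcess(@Payload SaleRequestFactory tf) {
    SaleResponseFactory resObj = new SaleResponseFactory();
    resObj.setUnique_id("123123");
    return resObj;
}

With a reply template and a @SendTo destination in place, the "No replyTopic to handle the reply" debug message should disappear, because the adapter then has somewhere to forward the returned SaleResponseFactory.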
