Kafka Source in Spring Cloud Data Flow

Problem Description

I am migrating from Spring XD to Spring Cloud Data Flow. While looking through the module list, I realised that some of the sources are not listed in Spring Cloud Data Flow - one of them is the KAFKA source.

My question is: why was the KAFKA source removed from the standard sources list in Spring Cloud Data Flow?

Recommended Answer

When I was looking through the module list, I realised that some of the sources are not listed in Spring Cloud Data Flow.

The majority of the applications have been ported over and the remaining ones are being prioritized incrementally - you can keep track of the remaining subset in the backlog.

My question is: why was the KAFKA source removed from the standard sources list in Spring Cloud Data Flow?

Kafka has not been removed. In fact, we are highly opinionated about Kafka in the context of streaming use-cases, so much so that it is baked directly into the DSL. More details here.

For example,

(i) if you have to consume from a Kafka topic (as a source), your stream definition would be:

stream create --definition ":someAwesomeTopic > log" --name subscribe_to_broker --deploy

(ii) if you have to write to a Kafka topic (as a sink), your stream definition would be:

stream create --definition "http --server.port=9001 > :someAwesomeTopic" --name publish_to_broker --deploy

(where *someAwesomeTopic* is the named destination, i.e. the Kafka topic name)
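By default the Kafka binder inside these apps points at a local broker. If your broker runs elsewhere, a minimal sketch (assuming the Kafka binder is in use; the host kafka.example.com:9092 and the stream/app names are illustrative assumptions) is to create the stream without --deploy and then pass the binder address as a deployment property:

stream create --definition "http --server.port=9001 > :someAwesomeTopic" --name publish_to_broker

stream deploy publish_to_broker --properties "app.http.spring.cloud.stream.kafka.binder.brokers=kafka.example.com:9092"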
