Flink stream not finishing


Problem Description

I am setting up a Flink stream processor using Kafka and Elasticsearch. I want to replay my data, but when I set the parallelism to more than 1, the program does not finish. I believe this is because only one of the Kafka consumer instances sees the message that is identified as the end of the stream.


    import java.util.Date;
    import org.apache.flink.api.common.serialization.DeserializationSchema;

    // Snippet from the question, completed with a presumed class skeleton;
    // DeserializationSchema also requires deserialize() and
    // getProducedType(), which are omitted here.
    public class CustomSchema implements DeserializationSchema<CustomTopicWrapper> {

        private final Date endTime;

        public CustomSchema(Date _endTime) {
            endTime = _endTime;
        }

        @Override
        public boolean isEndOfStream(CustomTopicWrapper nextElement) {
            // Signal end-of-stream once a record's timestamp reaches endTime.
            return this.endTime != null
                    && nextElement.messageTime.getTime() >= this.endTime.getTime();
        }
    }
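
For context, a schema like this would typically be wired into a Kafka source along the following lines (a hedged sketch; the topic name, properties, and environment setup are assumptions, since the question does not show them). It also illustrates the stall: each parallel consumer subtask evaluates isEndOfStream only on the records it reads itself, so with parallelism above 1 the subtasks that never see a message past endTime keep running.

    Properties kafkaProps = new Properties();
    kafkaProps.setProperty("bootstrap.servers", "localhost:9092"); // assumed address
    kafkaProps.setProperty("group.id", "replay-group");            // assumed group

    // Only the subtask that actually reads a message at or past endTime
    // gets `true` from isEndOfStream and stops; the others block forever.
    FlinkKafkaConsumer<CustomTopicWrapper> consumer =
            new FlinkKafkaConsumer<>("my-topic", new CustomSchema(endTime), kafkaProps);

    DataStream<CustomTopicWrapper> stream = env
            .addSource(consumer)
            .setParallelism(4); // finishes with parallelism 1, hangs above 1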

Is there a way to tell all of the threads in the Flink consumer group to end once one thread has completed?

Recommended Answer

If you implemented your own SourceFunction, use the cancel method, as the example in the Flink SourceFunction documentation shows. The class FlinkKafkaConsumerBase also has a cancel method.
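
A minimal sketch of that advice, assuming a hand-rolled source (the class name, the running flag, and the fetchNext helper are illustrative, not the asker's code): run() loops while a volatile flag is set, and cancel() clears the flag so every parallel instance exits.

    import org.apache.flink.streaming.api.functions.source.SourceFunction;

    public class BoundedReplaySource implements SourceFunction<CustomTopicWrapper> {

        private volatile boolean running = true;

        @Override
        public void run(SourceContext<CustomTopicWrapper> ctx) throws Exception {
            while (running) {
                CustomTopicWrapper next = fetchNext(); // hypothetical read from Kafka
                if (next == null) {
                    break; // local end condition (e.g. endTime) reached
                }
                ctx.collect(next);
            }
        }

        @Override
        public void cancel() {
            // Flink calls this on every parallel instance; the run() loop
            // sees the flag change and terminates promptly.
            running = false;
        }

        private CustomTopicWrapper fetchNext() {
            return null; // placeholder for the actual consumption logic
        }
    }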

