Flink stream not finishing


Problem description

I am setting up a Flink stream processor using Kafka and Elasticsearch. I want to replay my data, but when I set the parallelism to more than 1, the program does not finish. I believe this is because only one message is seen by the Kafka consumer as marking the end of the stream.


    private final Date endTime;

    public CustomSchema(Date endTime) {
        this.endTime = endTime;
    }

    @Override
    public boolean isEndOfStream(CustomTopicWrapper nextElement) {
        // End this consumer instance's stream once a message at or past endTime arrives.
        return this.endTime != null
                && nextElement.messageTime.getTime() >= this.endTime.getTime();
    }
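The symptom above can be reproduced without Flink: each parallel Kafka consumer instance calls isEndOfStream only on records from its own partitions, so only the instance whose partition contains a late-enough message ever stops. A minimal plain-Java simulation (CustomTopicWrapper is a simplified stand-in for the class in the question, and the partition contents are invented for illustration):

```java
import java.util.Date;
import java.util.List;

public class EndOfStreamDemo {
    // Simplified stand-in for the message wrapper from the question.
    static class CustomTopicWrapper {
        final Date messageTime;
        CustomTopicWrapper(Date messageTime) { this.messageTime = messageTime; }
    }

    // The same end-of-stream check as in the question's CustomSchema.
    static boolean isEndOfStream(Date endTime, CustomTopicWrapper next) {
        return endTime != null && next.messageTime.getTime() >= endTime.getTime();
    }

    public static void main(String[] args) {
        Date endTime = new Date(1000L);
        // Two "partitions": only partition 0 contains a message at or past endTime.
        List<CustomTopicWrapper> partition0 = List.of(
                new CustomTopicWrapper(new Date(500L)),
                new CustomTopicWrapper(new Date(1500L)));
        List<CustomTopicWrapper> partition1 = List.of(
                new CustomTopicWrapper(new Date(600L)),
                new CustomTopicWrapper(new Date(900L)));

        // Each parallel consumer instance checks only its own records, so the
        // instance reading partition1 never sees an end marker and never stops.
        boolean p0Stops = partition0.stream().anyMatch(m -> isEndOfStream(endTime, m));
        boolean p1Stops = partition1.stream().anyMatch(m -> isEndOfStream(endTime, m));
        System.out.println("partition0 stops: " + p0Stops); // true
        System.out.println("partition1 stops: " + p1Stops); // false
    }
}
```

With parallelism 1 a single instance reads every partition and eventually sees the end marker; with higher parallelism the other instances keep waiting forever.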

Is there a way to tell all threads in the Flink consumer group to end once one thread has completed?

Recommended answer

If you implemented your own SourceFunction, use the cancel method, as the example in the Flink SourceFunction documentation shows. The class FlinkKafkaConsumerBase also has a cancel method.
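The pattern behind SourceFunction.cancel is a volatile running flag: run() loops while the flag is set, and cancel() clears it so the loop exits. The sketch below is a simplified stand-in without Flink dependencies, not Flink's actual interface; the loop body and the end condition are invented placeholders for emitting records:

```java
// Minimal sketch of the cancel-flag pattern used by Flink's SourceFunction:
// run() emits records while a volatile flag is set; cancel() clears the flag,
// which is what makes run() return when Flink (or the source itself) stops it.
public class CancellableSource {
    private volatile boolean isRunning = true;
    private int emitted = 0;

    public void run() {
        while (isRunning && emitted < 1_000_000) {
            emitted++;            // stand-in for collecting one record
            if (emitted >= 3) {   // placeholder for the real end-of-data condition
                cancel();
            }
        }
    }

    // Called to stop the source; it must cause run() to return promptly.
    public void cancel() {
        isRunning = false;
    }

    public int emittedCount() { return emitted; }
}
```

Because the flag is volatile, a cancel() issued from another thread is seen by the run() loop, which is what lets the runtime shut the source down cleanly.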
