How to deploy a new job without downtime


Question

I have an Apache Flink application that reads from a single Kafka topic. I would like to update the application from time to time without experiencing downtime. For now, the Flink application executes some simple operators, such as map, and some synchronous IO to external systems via HTTP REST APIs.

I have tried to use the stop command, but I get "Job termination (STOP) failed: This job is not stoppable." I understand that the Kafka connector does not support the stop behavior. A simple solution would be to cancel with a savepoint and to redeploy the new jar from that savepoint, but then we get downtime. Another solution would be to control the deployment from the outside, for example by switching to a new topic.
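For reference, a minimal sketch of that cancel-with-savepoint approach using the Flink CLI of that era; the job ID, savepoint directory, and jar name below are placeholders, not values from the question, and newer Flink releases replace `cancel -s` with `stop`:

    # Cancel the running job and take a savepoint in one step
    # (the gap until the new job starts is the downtime mentioned above).
    flink cancel -s hdfs:///flink/savepoints <jobID>

    # Redeploy the new jar, resuming from the savepoint path printed above.
    flink run -s <savepointPath> new-version.jar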

What would be a good practice here?

Recommended answer

If you don't need exactly-once output (i.e., you can tolerate some duplicates), you can take a savepoint without cancelling the running job. Once the savepoint is completed, you start a second job. The second job could write to a different topic, but it doesn't have to. When the second job is up, you can cancel the first job.
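A minimal sketch of that sequence with the standard Flink CLI follows; the job IDs, the savepoint directory, and the jar name are placeholders, not values from the question:

    # Trigger a savepoint for the running job WITHOUT cancelling it;
    # the old job keeps consuming from Kafka while the savepoint is written.
    flink savepoint <oldJobID> hdfs:///flink/savepoints

    # Start the second job from that savepoint (new jar, same or different output topic).
    flink run -s <savepointPath> new-version.jar

    # Once the second job is up and healthy, cancel the first one.
    flink cancel <oldJobID>

Everything the first job processes after the savepoint is processed again by the second job, which resumes from the savepoint's Kafka offsets, so records in that window appear twice in the output; that is why this approach only works when duplicates are acceptable.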

