Apache Flink (How to uniquely tag Jobs)
Question
Is it possible to tag jobs with a unique name so I can stop them at a later date? I don't really want to grep and persist Job IDs.
In a nutshell, I want to stop a job as part of my deployment and deploy the new one.
Answer
You can name jobs when you start them in the execute(name: String) call, e.g.,
val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment
val result: DataStream[String] = ??? // your job logic (String is a placeholder element type)
result.addSink(new YourSinkFunction) // add a sink
env.execute("Name of your job") // execute and assign a name
The REST API of the JobManager provides a list of job details, which includes each job's name and its JobID.
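To go from a job name back to a JobID, you can query the JobManager's GET /jobs/overview endpoint and pick the entry whose name matches. The sketch below shows only the matching step: the JSON is a hand-written sample mimicking the response shape (the jid values are fake), and the regex extraction is a naive stand-in for a proper HTTP client plus JSON parser.

```scala
object FindJobId {
  // Hand-written sample of the /jobs/overview response shape (simplified);
  // a real client would fetch this over HTTP from the JobManager.
  val sampleResponse: String =
    """{"jobs":[
      |  {"jid":"a5b1c3d4e5f60718293a4b5c6d7e8f90","name":"Name of your job","state":"RUNNING"},
      |  {"jid":"0f9e8d7c6b5a49382716059f4e3d2c1b","name":"Other job","state":"FINISHED"}
      |]}""".stripMargin

  // Naive regex extraction: fine for a sketch, not for production JSON.
  def findJobId(json: String, jobName: String): Option[String] = {
    val pattern =
      ("\\{\"jid\":\"([^\"]+)\",\"name\":\"" +
        java.util.regex.Pattern.quote(jobName) + "\"").r
    pattern.findFirstMatchIn(json).map(_.group(1))
  }

  def main(args: Array[String]): Unit = {
    // Look up the JobID for the job named in execute("Name of your job").
    println(FindJobId.findJobId(sampleResponse, "Name of your job"))
  }
}
```

With the JobID in hand, your deployment script can stop that specific job (e.g., via the REST API or `flink stop <jobid>`) without having to persist IDs between deployments.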