Apache Flink (How to uniquely tag Jobs)


Question

Is it possible to tag jobs with a unique name so that I can stop them at a later date? I don't really want to grep and persist Job IDs.

In a nutshell, I want to stop a job as part of my deployment and deploy the new one.

Answer

You can name a job when you start it via the execute(name: String) call, e.g.,

import org.apache.flink.streaming.api.scala._

val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment

val result: DataStream[String] = ???    // your job logic
result.addSink(new YourSinkFunction)    // add a sink

env.execute("Name of your job")         // execute and assign a name

The REST API of the JobManager provides a list of job details, which includes the name of each job and its JobId.
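Since that endpoint returns JSON, the JobId for a named job can be picked out programmatically. Below is a minimal Scala sketch, assuming the response shape of Flink's `GET /jobs/overview` REST endpoint (each job object carries `"jid"`, `"name"`, and `"state"`); a real client would use a JSON library rather than a regex, and the exact path can differ between Flink versions:

```scala
// Sketch: look up a job's ID by name in the JSON body returned by
// GET http://<jobmanager>:8081/jobs/overview (Flink REST API).
// Assumes each job object lists "jid" before "name", as Flink does;
// a production client should parse the JSON with a proper library.
object JobLookup {
  def findJobId(overviewJson: String, jobName: String): Option[String] = {
    val quoted = java.util.regex.Pattern.quote(jobName)
    // Match "jid":"<id>" ... "name":"<jobName>" inside one job object;
    // [^{}]* keeps the match from crossing object boundaries.
    val entry =
      ("\"jid\"\\s*:\\s*\"([^\"]+)\"[^{}]*\"name\"\\s*:\\s*\"" + quoted + "\"").r
    entry.findFirstMatchIn(overviewJson).map(_.group(1))
  }
}
```

With the JobId in hand, the job can be stopped through the same REST API or with `flink cancel <jobId>` on the command line, so the deployment script only needs to know the job's name.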

