How can a Hadoop job kill itself
Problem description
Is there any way for a Hadoop job to kill itself, or to send a signal to kill it? I've read the configuration settings from the JobConf; if a user specifies the wrong settings, I need to kill the job or raise an error, but the map/reduce configure() method does not allow throwing a checked exception.
public void configure(JobConf job) {
    System.out.println("Inside config start processing");
    try {
        String strFileName = job.get("hadoop.rules");
        LoadFile(strFileName);
    } catch (Exception e) {
        e.printStackTrace();
        // Here I need to write code to kill the job
    }
}
Recommended answer
In the configure() method, just throw a RuntimeException.
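A minimal sketch of what that looks like with the old mapred API, reusing the hadoop.rules key and the LoadFile helper from the question (the class name here is illustrative, and LoadFile is stubbed out):

import java.io.IOException;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;

public class RulesMapperBase extends MapReduceBase {
    @Override
    public void configure(JobConf job) {
        String strFileName = job.get("hadoop.rules");
        try {
            LoadFile(strFileName);
        } catch (Exception e) {
            // An unchecked exception fails this task attempt; once the
            // attempt limit is exhausted, Hadoop fails the whole job.
            throw new RuntimeException(
                "Invalid hadoop.rules setting: " + strFileName, e);
        }
    }

    // Placeholder for the asker's rule-loading logic.
    private void LoadFile(String fileName) throws IOException {
    }
}

Note that this does not stop the job instantly: each task attempt that hits the bad setting fails and is retried (four attempts by default) before the job as a whole is marked failed, which is one reason to validate earlier.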
Better yet, if possible, perform your validation step before the job is run.
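For instance, the driver can check the setting before submitting, failing fast without launching any tasks. This is a sketch under the question's old mapred API; the driver class name, the args[0] source of the setting, and the file-existence check are illustrative assumptions:

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class RulesJobDriver {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(RulesJobDriver.class);
        conf.set("hadoop.rules", args[0]);

        // Validate before submission: no cluster resources are
        // consumed if this fails.
        String rulesFile = conf.get("hadoop.rules");
        if (rulesFile == null
                || !FileSystem.get(conf).exists(new Path(rulesFile))) {
            System.err.println("Missing or unreadable rules file: " + rulesFile);
            System.exit(1);
        }

        JobClient.runJob(conf); // safe to submit now
    }
}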