In Spark Streaming, is there a way to detect when a batch has finished?
Question
I use Spark 1.6.0 with Cloudera 5.8.3.

I have a DStream object and plenty of transformations defined on top of it:
val stream = KafkaUtils.createDirectStream[...](...)
val mappedStream = stream.transform { ... }.map { ... }
mappedStream.foreachRDD { ... }
mappedStream.foreachRDD { ... }
mappedStream.map { ... }.foreachRDD { ... }
Is there a way to register a last foreachRDD that is guaranteed to execute last, and only after all of the foreachRDDs above have finished? In other words: when the Spark UI shows that the job has completed - that is when I want to execute a lightweight function.
Is there something in the API that allows me to achieve that?

Thanks
Answer
Using a streaming listener should solve the problem for you (sorry, it's a Java example):
import org.apache.spark.streaming.scheduler.StreamingListener;
import org.apache.spark.streaming.scheduler.StreamingListenerBatchCompleted;

ssc.addStreamingListener(new JobListener());
// ...

class JobListener implements StreamingListener {
    @Override
    public void onBatchCompleted(StreamingListenerBatchCompleted batchCompleted) {
        // Fires on the driver once every job of the batch (i.e. every foreachRDD output) has run.
        System.out.println("Batch completed, total delay: "
                + batchCompleted.batchInfo().totalDelay().get().toString() + " ms");
    }

    /*
     * snipped other methods -- when implementing the Scala StreamingListener
     * trait from Java, the remaining callbacks also need (empty) bodies.
     */
}
https://gist.github.com/akhld/b10dc491aad1a2007183
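Since the snippets in the question are Scala, here is a minimal Scala sketch of the same idea; it assumes ssc is the StreamingContext behind the KafkaUtils.createDirectStream call above. In Scala the StreamingListener trait has no-op default implementations, so only onBatchCompleted needs overriding:

import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

ssc.addStreamingListener(new StreamingListener {
  // onBatchCompleted fires after all output operations (the foreachRDDs) of the
  // batch have finished - the same moment the Spark UI marks the batch as complete.
  override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
    // Run the lightweight "last" step here.
    val delayMs = batchCompleted.batchInfo.totalDelay.getOrElse(-1L)
    println(s"Batch completed, total delay: $delayMs ms")
  }
})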