In Spark Streaming, is there a way to detect when a batch has finished?


Question


I use Spark 1.6.0 with Cloudera 5.8.3.
I have a DStream object and plenty of transformations defined on top of it,

val stream = KafkaUtils.createDirectStream[...](...)
val mappedStream = stream.transform { ... }.map { ... }

// Several independent output operations are registered on the same stream;
// each foreachRDD produces its own job for every batch.
mappedStream.foreachRDD { ... }
mappedStream.foreachRDD { ... }
mappedStream.map { ... }.foreachRDD { ... }


Is there a way to register a last foreachRDD that is guaranteed to execute last, and only after all of the above foreachRDDs have finished executing?
In other words, when the Spark UI shows that the job is complete, that's when I want to execute a lightweight function.


Is there something in the API that allows me to achieve that?

Thanks.

Answer


Using streaming listeners should solve the problem for you:


(Sorry, it's a Java example.)

import org.apache.spark.streaming.scheduler.StreamingListener;
import org.apache.spark.streaming.scheduler.StreamingListenerBatchCompleted;

ssc.addStreamingListener(new JobListener());

// ...

class JobListener implements StreamingListener {

    @Override
    public void onBatchCompleted(StreamingListenerBatchCompleted batchCompleted) {
        System.out.println("Batch completed, total delay: "
                + batchCompleted.batchInfo().totalDelay().get().toString() + " ms");
    }

    /* snipped other methods */
}
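If your application is in Scala, the trait's other callbacks already have empty default implementations, so you only need to override the one you care about. Here is a minimal sketch; the BatchCompleteListener name and the onComplete callback are illustrative, not part of the Spark API:

import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

// Runs a lightweight callback once all jobs of a batch have finished.
class BatchCompleteListener(onComplete: () => Unit) extends StreamingListener {
  override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
    // totalDelay is an Option[Long] holding the batch's total delay in milliseconds.
    batchCompleted.batchInfo.totalDelay.foreach { delayMs =>
      println(s"Batch completed, total delay: $delayMs ms")
    }
    onComplete()
  }
}

// Register it on the StreamingContext before calling ssc.start():
ssc.addStreamingListener(new BatchCompleteListener(
  () => println("All foreachRDD jobs for this batch have finished")))

The onBatchCompleted callback fires only after every output operation (each foreachRDD) registered for that batch has run, which matches the "job was complete" moment you see in the Spark UI.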

https://gist.github.com/akhld/b10dc491aad1a2007183

https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-streaming/spark-streaming-streaminglisteners.html

http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.streaming.scheduler.StreamingListener

