Is it possible to process multiple streams in Apache Flink CEP?


Problem Description


My question is: if we have two raw event streams, i.e. Smoke and Temperature, and we want to find out whether a complex event, i.e. Fire, has happened by applying operators to the raw streams, can we do this in Flink?


I am asking this question because all the examples of Flink CEP that I have seen so far involve only one input stream. Please correct me if I am wrong.

Recommended Answer


Short answer - Yes, you can read and process multiple streams and fire rules based on your event types from the different stream sources.


Long answer - I had a somewhat similar requirement, and my answer is based on the assumption that you are reading the different streams from different Kafka topics.


Read from different topics which stream different events in a single source:

FlinkKafkaConsumer010<BAMEvent> kafkaSource = new FlinkKafkaConsumer010<>(
        Arrays.asList("topicStream1", "topicStream2", "topicStream3"),
        new StringSerializerToEvent(),
        props);

kafkaSource.assignTimestampsAndWatermarks(new TimestampAndWatermarkGenerator());
DataStream<BAMEvent> events = env.addSource(kafkaSource)
        .filter(Objects::nonNull);


The serializer reads the data and parses it into a common format - for example:

@Data
public class BAMEvent {
 private String keyid;  //If key based partitioning is needed
 private String eventName; // For different types of events
 private String eventId;  // Any other field you need
 private long timestamp; // For event time based processing 

 public String toString(){
   return eventName + " " + timestamp + " " + eventId + " " + keyid;
 }

}
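The answer does not show what `StringSerializerToEvent` actually does, so here is a hedged, dependency-free sketch of the parsing step it would perform. The comma-delimited field layout (`keyid,eventName,eventId,timestamp`) is an assumption for illustration, not part of the original answer; in practice the raw records might be JSON or Avro:

```java
public class BAMEventParser {

    // Plain POJO mirroring the BAMEvent above (Lombok's @Data omitted here).
    public static class BAMEvent {
        String keyid;
        String eventName;
        String eventId;
        long timestamp;
    }

    /**
     * Parses a delimited record such as "key1,event1,id-42,1633024800000".
     * The field order and comma delimiter are assumptions for illustration.
     * Returns null on malformed input so that a downstream
     * filter(Objects::nonNull), as in the source snippet, drops it.
     */
    public static BAMEvent parse(String raw) {
        if (raw == null) return null;
        String[] parts = raw.split(",");
        if (parts.length != 4) return null;
        BAMEvent event = new BAMEvent();
        event.keyid = parts[0];
        event.eventName = parts[1];
        event.eventId = parts[2];
        try {
            event.timestamp = Long.parseLong(parts[3]);
        } catch (NumberFormatException e) {
            return null;
        }
        return event;
    }
}
```

In a real deserializer this logic would live inside a Flink `DeserializationSchema` implementation, which is what the `FlinkKafkaConsumer010` constructor above expects.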


and after this, things are pretty straightforward: define the rules based on the event name, and compare the event names when defining the rules (you can also define complex rules, as follows):

Pattern.<BAMEvent>begin("first")
        .where(new SimpleCondition<BAMEvent>() {
          private static final long serialVersionUID = 1390448281048961616L;

          @Override
          public boolean filter(BAMEvent event) throws Exception {
            return event.getEventName().equals("event1");
          }
        })
        .followedBy("second")
        .where(new IterativeCondition<BAMEvent>() {
          private static final long serialVersionUID = -9216505110246259082L;

          @Override
          public boolean filter(BAMEvent secondEvent, Context<BAMEvent> ctx) throws Exception {

            if (!secondEvent.getEventName().equals("event2")) {
              return false;
            }

            for (BAMEvent firstEvent : ctx.getEventsForPattern("first")) {
              if (secondEvent.getEventId().equals(firstEvent.getEventId())) {
                return true;
              }
            }
            return false;
          }
        })
        .within(withinTimeRule);
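To make the intent of the pattern concrete, here is a hedged, Flink-free sketch of the same correlation logic in plain Java: an "event1" followed by an "event2" carrying the same eventId, within a time window. This only illustrates what the `IterativeCondition` above checks; it is not how Flink's CEP engine actually evaluates patterns:

```java
import java.util.ArrayList;
import java.util.List;

public class FireRuleSketch {

    // Minimal event representation; field names mirror BAMEvent above.
    public static class Event {
        final String eventName;
        final String eventId;
        final long timestamp;

        public Event(String eventName, String eventId, long timestamp) {
            this.eventName = eventName;
            this.eventId = eventId;
            this.timestamp = timestamp;
        }
    }

    /**
     * Returns the eventIds for which an "event1" is followed by an
     * "event2" with the same eventId within windowMillis - the same
     * condition the CEP pattern expresses with begin/followedBy/within.
     */
    public static List<String> matchIds(List<Event> stream, long windowMillis) {
        List<String> matches = new ArrayList<>();
        for (Event second : stream) {
            if (!second.eventName.equals("event2")) continue;
            for (Event first : stream) {
                if (first.eventName.equals("event1")
                        && first.eventId.equals(second.eventId)
                        && first.timestamp <= second.timestamp
                        && second.timestamp - first.timestamp <= windowMillis) {
                    matches.add(second.eventId);
                    break;
                }
            }
        }
        return matches;
    }
}
```

In the actual Flink job you would instead hand the pattern and the unified `events` stream to `CEP.pattern(...)` and select the matched fire events from the resulting pattern stream.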


I hope this gives you an idea of how to integrate two or more different streams together.

