
I have two streams of events:

  • L = (l1, l3, l8, ...) - the sparser stream, representing user logins to an IP
  • E = (e2, e4, e5, e9, ...) - a stream of logs from that particular IP

The subscript of each element is its timestamp. If we joined the two streams together and sorted them by time, we would get:

  • l1, e2, l3, e4, e5, l8, e9, ...

Would it be possible to implement custom Window / Trigger functions to group the events into sessions (the periods between logins of different users):

  • l1 - l3 : e2
  • l3 - l8 : e4, e5
  • l8 - l14 : e9, e10, e11, e12, e13
  • ...

The problem I see is that the two streams are not necessarily sorted. I thought about sorting the merged stream by timestamp first; then it would be easy to implement the windowing using a GlobalWindow and a custom Trigger, yet it seems that this is not possible.
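For concreteness, the kind of trigger I have in mind would look roughly like this. It is only a sketch: it assumes the merged stream is already sorted by timestamp and that the common Event type carries an isLogin flag, neither of which I have yet.

import org.apache.flink.streaming.api.windowing.triggers.Trigger;
import org.apache.flink.streaming.api.windowing.triggers.TriggerResult;
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow;

public class LoginBoundaryTrigger extends Trigger<Event, GlobalWindow> {
  @Override
  public TriggerResult onElement(Event event, long timestamp, GlobalWindow window, TriggerContext ctx) {
    // a login closes the running session: emit everything collected so far
    // (including this login) and clear the window for the next session
    return event.isLogin ? TriggerResult.FIRE_AND_PURGE : TriggerResult.CONTINUE;
  }

  @Override
  public TriggerResult onEventTime(long time, GlobalWindow window, TriggerContext ctx) {
    return TriggerResult.CONTINUE;
  }

  @Override
  public TriggerResult onProcessingTime(long time, GlobalWindow window, TriggerContext ctx) {
    return TriggerResult.CONTINUE;
  }

  @Override
  public void clear(GlobalWindow window, TriggerContext ctx) {
  }
}

The idea would be to plug it in with something like `.window(GlobalWindows.create()).trigger(new LoginBoundaryTrigger())` on the keyed, sorted stream.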

Am I missing something, or is it definitely not possible to do this in current Flink (v1.3.2)?

Thanks

Vojtech Letal

1 Answer


Question: shouldn't E3 come before L4?

Sorting is pretty straightforward using a ProcessFunction. Something like this:

import java.util.PriorityQueue;

import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.TimerService;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;

public static class SortFunction extends ProcessFunction<Event, Event> {
  // keyed state holding the not-yet-emitted events, ordered by timestamp
  private ValueState<PriorityQueue<Event>> queueState = null;

  @Override
  public void open(Configuration config) {
    ValueStateDescriptor<PriorityQueue<Event>> descriptor = new ValueStateDescriptor<>(
        // state name
        "sorted-events",
        // type information of state
        TypeInformation.of(new TypeHint<PriorityQueue<Event>>() {
        }));
    queueState = getRuntimeContext().getState(descriptor);
  }

  @Override
  public void processElement(Event event, Context context, Collector<Event> out) throws Exception {
    TimerService timerService = context.timerService();

    // buffer the event only if it is not already late
    if (context.timestamp() > timerService.currentWatermark()) {
      PriorityQueue<Event> queue = queueState.value();
      if (queue == null) {
        queue = new PriorityQueue<>(10);
      }
      queue.add(event);
      queueState.update(queue);
      // ask to be called back once the watermark has passed this event's timestamp
      timerService.registerEventTimeTimer(event.timestamp);
    }
  }

  @Override
  public void onTimer(long timestamp, OnTimerContext context, Collector<Event> out) throws Exception {
    PriorityQueue<Event> queue = queueState.value();
    Long watermark = context.timerService().currentWatermark();
    // emit, in timestamp order, every buffered event the watermark has passed
    Event head = queue.peek();
    while (head != null && head.timestamp <= watermark) {
      out.collect(head);
      queue.remove(head);
      head = queue.peek();
    }
    // write the shrunken queue back so the removals are persisted (needed with RocksDB)
    queueState.update(queue);
  }
}
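For completeness, here is a hypothetical Event type matching what the code above assumes: the PriorityQueue needs its elements to be Comparable, and the field names here are assumptions, not something given in the question.

public class Event implements Comparable<Event> {
  public String ip;        // the key the stream is partitioned on, e.g. the IP address
  public long timestamp;   // event time in epoch milliseconds
  public boolean isLogin;  // true for elements of the login stream L, false for the log stream E

  public Event() {}        // no-arg constructor so Flink can treat this as a POJO

  public Event(String ip, long timestamp, boolean isLogin) {
    this.ip = ip;
    this.timestamp = timestamp;
    this.isLogin = isLogin;
  }

  @Override
  public int compareTo(Event other) {
    // order events by time so the PriorityQueue releases them in timestamp order
    return Long.compare(this.timestamp, other.timestamp);
  }
}

With both streams mapped to this type, something like `logins.union(logs).keyBy(e -> e.ip).process(new SortFunction())` would give a per-key sorted stream, on which the session windowing can then be applied.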

Update: see How to sort an out-of-order event time stream using Flink for a description of a generally better approach.

David Anderson
  • Hello David, I think you will have a serialization / deserialization performance issue with this value state if you get and update it on every message (with RocksDB). – Eldinea Nov 16 '19 at 06:28
  • Yes, that's right. It's better to use MapState with RocksDB. – David Anderson Nov 16 '19 at 07:39
  • Yes, thanks! With the timestamp as key, would that be a good idea? – Eldinea Nov 17 '19 at 08:45
  • What I don't understand: when `onTimer` fires, it releases all events `<= watermark`. If I had a window operation after the sort, wouldn't there be a lot of late arrivals, since events are emitted with timestamps smaller than the watermark? – emilio Aug 19 '22 at 08:17
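Following up on the MapState suggestion in the comments, here is a rough sketch of that variant (not part of the original answer). The buffer is keyed by timestamp, so with RocksDB only the events for a single timestamp are (de)serialized per access, and a timer per distinct timestamp releases exactly those events once the watermark has passed it. The Event type is the same assumed POJO as above.

import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;

public class MapStateSortFunction extends ProcessFunction<Event, Event> {
  // one entry per distinct timestamp, holding all buffered events with that timestamp
  private MapState<Long, List<Event>> bufferState;

  @Override
  public void open(Configuration config) {
    MapStateDescriptor<Long, List<Event>> descriptor = new MapStateDescriptor<>(
        "buffered-events",
        TypeInformation.of(Long.class),
        TypeInformation.of(new TypeHint<List<Event>>() {}));
    bufferState = getRuntimeContext().getMapState(descriptor);
  }

  @Override
  public void processElement(Event event, Context context, Collector<Event> out) throws Exception {
    long ts = event.timestamp;
    if (ts > context.timerService().currentWatermark()) {   // drop events that are already late
      List<Event> sameTime = bufferState.get(ts);
      if (sameTime == null) {
        sameTime = new ArrayList<>();
      }
      sameTime.add(event);
      bufferState.put(ts, sameTime);
      // fires once the watermark passes this timestamp; repeated registrations are deduplicated
      context.timerService().registerEventTimeTimer(ts);
    }
  }

  @Override
  public void onTimer(long timestamp, OnTimerContext context, Collector<Event> out) throws Exception {
    // every buffered event with exactly this timestamp is now safe to emit
    for (Event event : bufferState.get(timestamp)) {
      out.collect(event);
    }
    bufferState.remove(timestamp);
  }
}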