
I have two services: a producer and a consumer.

As far as I understand, message.ts is the time the producer produced the message (not the time the Kafka broker received it).

Questions

  1. When the consumer consumes a message, how can I know how much time it spent inside the Kafka broker (excluding the network latency from producer to broker and from broker to consumer)? (See the sketch below the questions.)

  2. I ran a ping from my consumer VM to the Kafka broker; the result was 0.7 ms (milliseconds). Is the network latency from each side to the Kafka broker 0.3 ms? I assume Kafka's transport is TCP, so there is an ACK for everything, and I assume each side won't do anything without the ACK, so I conclude that the network latency on each side is the same as the ping result: 0.7 ms. Am I correct?
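
For context on question 1, here is a minimal consumer-side sketch (assuming the Java client; the broker address and topic name are placeholders). Kafka records carry a timestamp whose meaning depends on the topic's message.timestamp.type config: CreateTime (the default) is set by the producer, while LogAppendTime is set by the broker when it appends the record. Comparing that timestamp with the consume time is only meaningful if the producer, broker, and consumer clocks are synchronized.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class DwellTimeEstimate {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "latency-check");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("my-topic")); // placeholder topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    long consumedAt = System.currentTimeMillis();
                    // record.timestamp() is the producer's CreateTime by default, or the
                    // broker's append time if the topic is configured with LogAppendTime.
                    long deltaMs = consumedAt - record.timestamp();
                    System.out.printf("timestampType=%s delta=%d ms%n",
                            record.timestampType(), deltaMs);
                }
            }
        }
    }
}
```

With LogAppendTime the difference excludes the producer-to-broker hop, so it approximates the time spent in the broker plus the broker-to-consumer hop; subtracting an estimate of that last hop (e.g. from ping) gives a rough broker dwell time.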

Stav Alfi

2 Answers


It's a little more complicated than that. Many variables go into how long it takes to process a message. I suggest you look into distributed tracing. Something like Zipkin works like magic and is very easy to set up and use. Here's a tutorial on how to set up Zipkin tracing with Spring Boot. You can even use it with Kafka Connect via an interceptor; here's the one I use: brave-kafka-interceptor.
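
For reference, the interceptor is wired in through Kafka's standard interceptor.classes client property. Below is a minimal sketch of a producer configured that way; the broker address is a placeholder, and the interceptor class name and its Zipkin-related settings should be checked against the brave-kafka-interceptor README for the version you use.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;

public class TracedProducer {
    public static KafkaProducer<String, String> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // Kafka's built-in hook for client interceptors; the tracing interceptor records a
        // span for every send. The class name is taken from the brave-kafka-interceptor
        // project; verify it (and the Zipkin endpoint / service-name properties it reads
        // from this same config) against the project's README.
        props.put(ProducerConfig.INTERCEPTOR_CLASSES_CONFIG,
                "brave.kafka.interceptor.TracingProducerInterceptor");
        return new KafkaProducer<>(props);
    }
}
```

The consumer side is analogous: register the project's consumer interceptor class via the consumer's interceptor.classes property.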

Zipkin produces a trace for every message, including all producers and consumers that processed it. Those traces end up looking something like this:

[screenshot: Zipkin trace timeline]

You can see how much time a message took to be processed, and how much time it took to be consumed after being produced, which is what you're looking for.

Odai Mohammed
  • I don't have distributed tracing yet. It's a great tool and I will look into it, but for now I need answers regardless of whether I have distributed tracing. – Stav Alfi Jul 14 '21 at 07:55

I tested this manually by producing and consuming from the same VM to a Kafka broker (which was inside my cluster). The result was 1.3-1.5 ms.

That means the processing took about 0.1 ms on average.

  • I produced a new message every second to avoid backlog delays on the consumer side.

This is not the best solution, but it is sufficient for my research.
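
For anyone who wants to reproduce a test like this, here is a rough sketch of the idea in Java (broker address and topic name are placeholders): produce a message, wait for the broker's ack, and time how long it takes until the same process consumes it back, with both timestamps taken from the same VM's clock.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class RoundTripProbe {
    public static void main(String[] args) throws Exception {
        String broker = "localhost:9092"; // placeholder broker address
        String topic = "latency-probe";   // placeholder topic name

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", broker);
        producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("linger.ms", "0"); // send immediately, don't batch

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", broker);
        consumerProps.put("group.id", "latency-probe");
        consumerProps.put("auto.offset.reset", "latest");
        consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
             KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of(topic));
            while (consumer.assignment().isEmpty()) {
                consumer.poll(Duration.ofMillis(100)); // wait until partitions are assigned
            }

            for (int i = 0; i < 10; i++) {
                long sentAt = System.nanoTime();
                producer.send(new ProducerRecord<>(topic, "probe-" + i)).get(); // wait for broker ack
                outer:
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                    for (ConsumerRecord<String, String> record : records) {
                        if (record.value().equals("probe-" + i)) {
                            double roundTripMs = (System.nanoTime() - sentAt) / 1_000_000.0;
                            System.out.printf("produce->consume round trip: %.2f ms%n", roundTripMs);
                            break outer;
                        }
                    }
                }
                Thread.sleep(1000); // one probe per second, as in the test above
            }
        }
    }
}
```

Waiting on send(...).get() keeps the probe synchronous but also folds the ack's return leg into the measurement, and at this resolution the consumer's fetch polling interval matters too, so treat the numbers as an upper bound.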

Stav Alfi