This question is similar to Using Kafka as a (CQRS) Eventstore. Good idea?, but more implementation-specific. How do I use Kafka as an event store when I have thousands of event "sources" (aggregate roots in DDD)? As I've read in the linked question and elsewhere, a topic per source is problematic. If I split events into topics by type, they become much easier to consume and store, but I still need access to the event stream of a particular source. How do I do event sourcing with Kafka?

wedens
  • An entity-instance-based topic is a no-go, as you mention, since it creates a huge number of topics (which Kafka was not designed to handle). The only real possibility is a type-based topic, but that creates the problem of finding the commands/events that relate to only one specific aggregate instance. How did you solve the problem of quickly finding the events (within a type-based topic) needed to reconstruct an entity instance? – tony _008 Nov 08 '19 at 14:18

1 Answer

Post events from all of your sources to a single topic, using a serialized data type (Thrift?) that includes a unique identifier for each event source. Then create a consumer for each event type you are interested in, giving each one a unique consumer group name; that way each consumer maintains its own offset in ZooKeeper. Everybody reads the whole topic but only outputs (or deals with) events from a single source (or group of sources).
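To make the idea concrete, here is a minimal sketch of the pattern in plain Python, with a list standing in for the single Kafka topic. The names (`Event`, `replay_source`, the `order-*` ids) are illustrative assumptions, not a Kafka API: every event carries its source id, and a consumer rebuilds one aggregate by scanning the whole log and keeping only that source's events.

```python
from dataclasses import dataclass

@dataclass
class Event:
    source_id: str   # unique identifier of the aggregate root (event source)
    seq: int         # per-source sequence number, for ordering checks
    payload: dict

# The single shared "topic": an append-only log holding events
# from every source, interleaved in arrival order.
topic = [
    Event("order-1", 0, {"type": "Created"}),
    Event("order-2", 0, {"type": "Created"}),
    Event("order-1", 1, {"type": "Shipped"}),
]

def replay_source(log, source_id):
    """Read the whole log, keep only one source's events, in log order."""
    return [e for e in log if e.source_id == source_id]

# Reconstructing one aggregate's stream from the shared topic:
history = replay_source(topic, "order-1")
assert [e.payload["type"] for e in history] == ["Created", "Shipped"]
```

With a real Kafka deployment you would additionally key each message by `source_id`, so all events of one source land in the same partition and keep their relative order; each consumer group then tracks its own offset independently, as the answer describes. Note that every consumer still reads the full topic, so this trades read amplification for a small, fixed topic count.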

ethrbunny