
I'm developing a web application using Express.js and want to leverage current technology and architecture, i.e. Kafka, microservices, etc. The frontend is React, and it calls the backend microservices to retrieve data.

My current architecture consists of multiple backend services exposed as REST API endpoints, such as a user service, an account service, a company service, etc.

All these services work fine, but now that I'm introducing Kafka into the mix, I need to publish a 'new user' event when a client registers for an account: the user service publishes this event, and the account service then needs to consume it.
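To illustrate, the publish step I have in mind looks roughly like this (a simplified sketch using the kafkajs client; the topic and field names are just placeholders, not my actual code):

    // user-service/events.js -- simplified sketch (kafkajs, placeholder names)
    const { Kafka } = require('kafkajs');

    const kafka = new Kafka({ clientId: 'user-service', brokers: ['localhost:9092'] });
    const producer = kafka.producer();

    async function publishUserCreated(user) {
      await producer.connect();
      // Key by user id so all events for one user land on the same partition
      await producer.send({
        topic: 'user-events',
        messages: [
          { key: String(user.id), value: JSON.stringify({ type: 'USER_CREATED', payload: user }) },
        ],
      });
    }

    module.exports = { publishUserCreated };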

Should I create a new, dedicated subscriber service to consume this event, connecting to the same database as the account service (though doesn't this defeat the purpose of the one-database-per-service microservice architecture)? Or should the account service, which already acts as a REST API endpoint, also consume the Kafka event (doesn't this complicate things once there are 20+ microservices, with time spent tracking which service consumes which event)?

I'd like to know what the best approach is in this kind of situation.

Pon Pon

1 Answer


In general, microservices expose REST APIs for business/CRUD capabilities, while the Kafka broker is mostly used to achieve eventual consistency and to trigger actions asynchronously via dedicated Kafka consumers. Now to your particular question:

Should I create a new, dedicated subscriber service to consume this event, connecting to the same database as the account service (though doesn't this defeat the purpose of the one-database-per-service microservice architecture)?

Each microservice has its own data store, which may need to be kept consistent/in sync with the data stores of other microservices. You can create dedicated Kafka topics for the relevant events; for example, "User_Resource" could be a topic to which you publish all events (CRUD) related to the User resource. Other microservices subscribe to these topics, and their consumers contain the logic to handle the events (update the account service's database, trigger notifications to other downstream systems, etc.). This also creates a clean separation between CRUD and business services.
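As a rough sketch of the consumer side (assuming the Node kafkajs client, since you are on Express.js; the topic, group and repository names below are illustrative, not prescriptive):

    // account-service/userEventsConsumer.js -- illustrative sketch (kafkajs)
    const { Kafka } = require('kafkajs');
    const accountsDb = require('./accountsDb'); // hypothetical repository for the account service's OWN database

    const kafka = new Kafka({ clientId: 'account-service', brokers: ['localhost:9092'] });
    const consumer = kafka.consumer({ groupId: 'account-service' });

    async function startUserEventsConsumer() {
      await consumer.connect();
      await consumer.subscribe({ topic: 'User_Resource', fromBeginning: false });

      await consumer.run({
        eachMessage: async ({ message }) => {
          const event = JSON.parse(message.value.toString());
          // The account service only touches its own data store here,
          // so the one-database-per-service rule is preserved.
          if (event.type === 'USER_CREATED') {
            await accountsDb.createAccountFor(event.payload);
          }
        },
      });
    }

    module.exports = { startUserEventsConsumer };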

Or should the account service, which already acts as a REST API endpoint, also consume the Kafka event (doesn't this complicate things once there are 20+ microservices, with time spent tracking which service consumes which event)?

A service that exposes a REST endpoint can also act as a Kafka producer/consumer. If your application is built using Spring Boot and the Spring Cloud framework, you can use spring-cloud-stream to handle the Kafka interactions in the simplest way. The services need not be concerned with the state of other services, since they are supposed to be independent.
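With Express.js, the equivalent is simply starting the consumer alongside the HTTP server in the same process. A minimal sketch, assuming the startUserEventsConsumer function from the previous snippet:

    // account-service/index.js -- minimal sketch: REST endpoint + Kafka consumer in one service
    const express = require('express');
    const { startUserEventsConsumer } = require('./userEventsConsumer');

    const app = express();

    // Regular REST API exposed by the account service
    app.get('/accounts/:id', (req, res) => {
      // look up the account in this service's own database (omitted here)
      res.json({ id: req.params.id });
    });

    app.listen(3000, () => console.log('account-service listening on 3000'));

    // The Kafka consumer runs in the same process, independent of the HTTP routes
    startUserEventsConsumer().catch((err) => {
      console.error('consumer failed', err);
      process.exit(1);
    });

Whether the consumer lives inside the account service or in a separate subscriber service, the key point is that only the account service's own code writes to the account database.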

nakul shukla