I'm working on a Spring Cloud Data Flow (SCDF) + Kafka architecture that I deployed on a Kubernetes cluster.
We have an existing public REST API (a Spring Boot application outside the Kubernetes cluster) that sends messages to an Azure Event Hub.
We now want this API to initiate streams on SCDF. I have two ideas for this, but I'm not sure how to implement either of them:
- Develop an SCDF source application that reads messages from the Azure Event Hub: but I don't understand whether I need to declare the Event Hub as a binder? (See the first sketch below.)
- Send messages directly from our API to the Kafka instance that SCDF uses: is this good practice? Can I use any Kafka topic as the source of a processor? (See the second sketch below.)
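For the first idea, here is roughly what I have in mind. From what I've read, Event Hubs (Standard tier and up) exposes a Kafka-compatible endpoint, so maybe I can keep the standard Kafka binder and just declare a second binder environment pointing at Event Hubs, rather than a dedicated Event Hub binder? A rough sketch (the binder names `eventhub` and `kafka`, the function name `bridge`, and the hub name `my-event-hub` are all placeholders):

```java
import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class EventHubBridgeApplication {

    public static void main(String[] args) {
        SpringApplication.run(EventHubBridgeApplication.class, args);
    }

    // Consumes each record from the Event Hub and forwards it unchanged
    // into the SCDF stream; mapping/filtering logic would go here.
    @Bean
    public Function<String, String> bridge() {
        return payload -> payload;
    }
}

/*
 * application.properties sketch (two Kafka binder environments: "eventhub"
 * pointing at the Event Hubs Kafka endpoint, "kafka" at the in-cluster broker):
 *
 * spring.cloud.function.definition=bridge
 * spring.cloud.stream.bindings.bridge-in-0.destination=my-event-hub
 * spring.cloud.stream.bindings.bridge-in-0.binder=eventhub
 * spring.cloud.stream.bindings.bridge-out-0.binder=kafka
 * spring.cloud.stream.binders.eventhub.type=kafka
 * spring.cloud.stream.binders.eventhub.environment.spring.cloud.stream.kafka.binder.brokers=<namespace>.servicebus.windows.net:9093
 * spring.cloud.stream.binders.eventhub.environment.spring.cloud.stream.kafka.binder.configuration.security.protocol=SASL_SSL
 * spring.cloud.stream.binders.eventhub.environment.spring.cloud.stream.kafka.binder.configuration.sasl.mechanism=PLAIN
 * spring.cloud.stream.binders.eventhub.environment.spring.cloud.stream.kafka.binder.configuration.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<connection-string>";
 */
```

My understanding is that if SCDF deploys this app as a source, it wires the output binding itself, so the extra binder environment would only cover the Event Hubs side. Is that right?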
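For the second idea, this is the kind of publisher I would add to the API, assuming the API can reach the cluster's Kafka brokers and the stream consumes from a named destination (the topic name `api-events` is a placeholder):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class StreamPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public StreamPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publishes a payload onto the topic that the SCDF stream uses as its
    // named-destination source.
    public void publish(String payload) {
        kafkaTemplate.send("api-events", payload);
    }
}
```

The matching stream definition would then start from the same destination, e.g. `stream create api-stream --definition ":api-events > my-processor | log"`, if I understand the named-destination syntax correctly.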
Maybe I'm missing something in this architecture, but after reading a lot of documentation, I still don't understand how to make the link with our API.
Thanks