I'm beginning to use Kubeflow and I've read about KFServing, which has an integration with Kafka for consuming from a topic. What I need is to run an inference pipeline that consumes data from Kafka, does some data preprocessing, runs the ML prediction, and then publishes the result back to Kafka, like this:

Kafka topic 1 -> preprocessing -> ML prediction -> Kafka topic 2
I know I could use Knative Eventing (KafkaSource) to ingest data from Kafka topic 1. Is there also a way to use Kafka as a sink, or do I need to write a separate container that publishes the results to Kafka topic 2 myself?
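For context, something like the following is what I was hoping exists on the sink side. This is only a sketch based on the KafkaSink resource from the Knative eventing-kafka-broker project; the API version, topic name, and bootstrap server address here are my assumptions, not something I've verified on a cluster:

```yaml
# Hypothetical KafkaSink: an addressable that forwards
# CloudEvents it receives to a Kafka topic.
apiVersion: eventing.knative.dev/v1alpha1
kind: KafkaSink
metadata:
  name: inference-results-sink   # assumed name
  namespace: default
spec:
  topic: topic-2                 # the output topic from my pipeline
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092  # assumed broker address
```

If a resource like this works, I could presumably point the last step of the pipeline at its address instead of writing my own producer container, but I'm not sure whether this is supported or recommended with KFServing.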
Thanks!