First of all, there is no specified standard.
Possible options:
- Java EE web application
- Spring web application
- Application with spring-kafka (@KafkaListener)
The Kafka producer will typically accept some commands. In real-life scenarios I have worked with applications that run continuously with listeners: on receiving requests, they triggered jobs, batches, and so on.
This can be achieved using, for example:
- A web server accepting HTTP requests (a sketch follows this list)
- A standalone Spring application with @KafkaListener
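As a minimal sketch of the first option, assuming Spring Boot with spring-kafka on the classpath (the endpoint path, topic name, and class names are illustrative, not part of the original answer):

import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CommandController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public CommandController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // accept a command over HTTP and publish it to a Kafka topic
    @PostMapping("/commands")
    public ResponseEntity<Void> accept(@RequestBody String command) {
        kafkaTemplate.send("commands", command);
        return ResponseEntity.accepted().build();
    }
}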
The consumer could be a Spring application with @KafkaListener:

@KafkaListener(topics = "${some.topic}")
public void accept(Message message) {
    // process the message
}
A Spring application with @KafkaListener will run indefinitely by default. The listener containers created for @KafkaListener annotations are registered with an infrastructure bean of type KafkaListenerEndpointRegistry. This bean manages the containers' lifecycle; it auto-starts any container whose autoStartup property is set to true. KafkaMessageListenerContainer uses a TaskExecutor to run the main KafkaConsumer poll loop.
See the Spring for Apache Kafka documentation for more information.
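If you need to control that lifecycle yourself, the registry can start and stop individual containers by id. A minimal sketch, assuming a listener declared with id = "myListener" and autoStartup = "false" (both names are illustrative):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.stereotype.Component;

@Component
public class ListenerLifecycle {

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    public void startListener() {
        // look up the container by the @KafkaListener id and start it
        registry.getListenerContainer("myListener").start();
    }

    public void stopListener() {
        registry.getListenerContainer("myListener").stop();
    }
}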
If you decide to go without any frameworks or application servers, a possible solution is to create the listener in a separate thread:
import java.time.Duration;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerListener implements Runnable {

    // properties and topics hold the consumer configuration and the topics to subscribe to
    private final Consumer<String, String> consumer = new KafkaConsumer<>(properties);

    @Override
    public void run() {
        try {
            consumer.subscribe(topics);
            while (true) {
                // poll for new records and process them
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    // consume
                }
            }
        } finally {
            consumer.close();
        }
    }
}
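To keep the process running, start this Runnable from your main method, for instance on an executor (a sketch, assuming the ConsumerListener above; the Main class name is illustrative):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class Main {
    public static void main(String[] args) {
        // the consumer loop blocks its thread, so run it off the main thread;
        // the non-daemon executor thread keeps the JVM alive
        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.submit(new ConsumerListener());
    }
}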