I have a microservice project generated by JHipster. When I run it on localhost I have no problems, but when I deploy my code to a server via Docker images I get this error from Kafka:
[Consumer clientId=consumer-2, groupId=groupId] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.
I have searched for this error but found no solution.
application.yml
kafka:
  # bootstrap-servers: 206.189.178.228:9092
  consumer:
    bootstrap-servers: 206.189.178.228:9092 # I also tried kafka:9092
    key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
    value.deserializer: org.apache.kafka.common.serialization.StringDeserializer
    group.id: store-service
    auto.offset.reset: earliest
  producer:
    bootstrap-servers: 206.189.178.228:9092 # I also tried kafka:9092
    key.serializer: org.apache.kafka.common.serialization.StringSerializer
    value.serializer: org.apache.kafka.common.serialization.StringSerializer
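Since the application container runs on the same Docker network as the broker, my understanding is that a profile-specific override pointing at the compose service name could also work; a sketch of what I mean (the file name application-prod.yml and the active profile are assumptions on my side, I have not verified this):

```yaml
# application-prod.yml (assumption: the prod profile is active inside the container)
kafka:
  consumer:
    bootstrap-servers: kafka:9092  # compose service name, resolvable on food_default
  producer:
    bootstrap-servers: kafka:9092
```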
Kafka config:
@Configuration
@ConfigurationProperties(prefix = "kafka")
public class KafkaProperties {

    private String bootStrapServers = "206.189.178.228:9092";

    private Map<String, String> consumer = new HashMap<>();
    private Map<String, String> producer = new HashMap<>();

    public String getBootStrapServers() {
        return bootStrapServers;
    }

    public void setBootStrapServers(String bootStrapServers) {
        this.bootStrapServers = bootStrapServers;
    }

    public Map<String, Object> getConsumerProps() {
        Map<String, Object> properties = new HashMap<>(this.consumer);
        if (!properties.containsKey("bootstrap.servers")) {
            properties.put("bootstrap.servers", this.bootStrapServers);
        }
        return properties;
    }

    public void setConsumer(Map<String, String> consumer) {
        this.consumer = consumer;
    }

    public Map<String, Object> getProducerProps() {
        Map<String, Object> properties = new HashMap<>(this.producer);
        if (!properties.containsKey("bootstrap.servers")) {
            properties.put("bootstrap.servers", this.bootStrapServers);
        }
        return properties;
    }

    public void setProducer(Map<String, String> producer) {
        this.producer = producer;
    }
}
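To show what I think happens in getConsumerProps(), here is a standalone sketch of the same merge logic (not the JHipster class itself). If I read Spring's map binding right, the YAML key bootstrap-servers is kept with its hyphen inside the consumer map, so it never matches "bootstrap.servers" and the fallback fires; that reading is my assumption:

```java
import java.util.HashMap;
import java.util.Map;

public class BootstrapFallbackDemo {

    // Same merge rule as KafkaProperties#getConsumerProps: copy the configured
    // map and add "bootstrap.servers" only when that exact key is absent.
    static Map<String, Object> withFallback(Map<String, String> configured, String bootstrap) {
        Map<String, Object> props = new HashMap<>(configured);
        if (!props.containsKey("bootstrap.servers")) {
            props.put("bootstrap.servers", bootstrap);
        }
        return props;
    }

    public static void main(String[] args) {
        Map<String, String> consumer = new HashMap<>();
        // Assumption: the hyphenated YAML key lands in the map as-is.
        consumer.put("bootstrap-servers", "206.189.178.228:9092");

        Map<String, Object> merged = withFallback(consumer, "206.189.178.228:9092");
        // The dotted key was absent, so the default bootStrapServers is used.
        System.out.println(merged.get("bootstrap.servers")); // → 206.189.178.228:9092
    }
}
```

So whichever address the field bootStrapServers defaults to is what the consumer actually uses, regardless of the hyphenated entry.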
And my security config:
@Override
public void configure(HttpSecurity http) throws Exception {
    // @formatter:off
    http
        .csrf()
        .disable()
        .exceptionHandling()
        .authenticationEntryPoint(problemSupport)
        .accessDeniedHandler(problemSupport)
    .and()
        .headers()
        .contentSecurityPolicy("default-src 'self'; frame-src 'self' data:; script-src 'self' 'unsafe-inline' 'unsafe-eval' https://storage.googleapis.com; style-src 'self' 'unsafe-inline'; img-src 'self' data:; font-src 'self' data:")
    .and()
        .referrerPolicy(ReferrerPolicyHeaderWriter.ReferrerPolicy.STRICT_ORIGIN_WHEN_CROSS_ORIGIN)
    .and()
        .featurePolicy("geolocation 'none'; midi 'none'; sync-xhr 'none'; microphone 'none'; camera 'none'; magnetometer 'none'; gyroscope 'none'; speaker 'none'; fullscreen 'self'; payment 'none'")
    .and()
        .frameOptions()
        .deny()
    .and()
        .sessionManagement()
        .sessionCreationPolicy(SessionCreationPolicy.STATELESS)
    .and()
        .authorizeRequests()
        .antMatchers("/api/auth-info").permitAll()
        .antMatchers("/api/store-service-kafka/publish").permitAll()
        .antMatchers("/api/**").authenticated()
My docker-compose kafka.yml:
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    networks:
      - food_default
  kafka:
    image: confluentinc/cp-kafka:5.5.0
    ports:
      - 9092:9092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ADVERTISED_HOST_NAME: kafka
      LISTENERS: PLAINTEXT://206.189.178.228:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    networks:
      - food_default
networks:
  food_default:
    external: true
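From what I read in the Confluent Docker examples, the internal and external listeners may need to be split into two named listeners, with each advertised under the address its clients can actually reach; a variant I am considering, though I have not verified it fixes my error (the 29092 internal port is my own choice):

```yaml
  kafka:
    environment:
      # PLAINTEXT: internal listener for containers on food_default
      # PLAINTEXT_HOST: external listener for clients using the public IP
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://206.189.178.228:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
```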
This is the documentation from JHipster that I followed: Kafka with JHipster.