
I have created a simple Java HelloWorld app. When I start Kafka and then the app, all messages are logged to Kafka and the app finishes with exit code 0. However, the app never finishes if I do not start Kafka beforehand: it keeps waiting until Kafka is available, then logs all messages and terminates with exit code 0.

How can I force log4j2 to terminate without having to start Kafka? I tried

LogManager.shutdown();

but it has no effect. Thank you.

HelloWorld

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.core.LifeCycle;

public class HelloWorld {
    private static final Logger logger = LogManager.getLogger(HelloWorld.class);

    public static void main(String[] args) throws InterruptedException {
        logger.info("test1");
        logger.error("test2");

        LogManager.shutdown();
    }
}

log4j2.xml

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" shutdownHook="disabled">
    <Appenders>
        <Kafka name="Kafka" topic="mylog-events" syncSend="false">
            <JsonTemplateLayout eventTemplateUri="classpath:jsonLayout.json"/>
            <Property name="bootstrap.servers">localhost:9092</Property>
        </Kafka>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} [%p] - %t - %m%n"/>
        </Console>
        <Async name="AsyncKafka">
            <AppenderRef ref="Kafka"/>
        </Async>
    </Appenders>
    <Loggers>
        <AsyncLogger name="org.apache.kafka" level="INFO" additivity="false">
            <AppenderRef ref="Console"/>
        </AsyncLogger>
        <AsyncRoot level="DEBUG">
            <AppenderRef ref="Console"/>
            <AppenderRef ref="AsyncKafka"/>
        </AsyncRoot>
    </Loggers>
</Configuration>
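One mitigation discussed in the comments is bounding how long the producer may block. The `Property` elements of the Kafka appender are passed through to the underlying Kafka producer (as `bootstrap.servers` already is above), so producer timeouts can be set directly in the appender. The values below are illustrative, not recommendations; note that Kafka requires `delivery.timeout.ms` to be at least `linger.ms` + `request.timeout.ms`, hence `request.timeout.ms` is lowered as well:

```xml
<Kafka name="Kafka" topic="mylog-events" syncSend="false">
    <JsonTemplateLayout eventTemplateUri="classpath:jsonLayout.json"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
    <!-- Fail fast when the broker is unreachable (illustrative values) -->
    <Property name="max.block.ms">2000</Property>
    <Property name="request.timeout.ms">2000</Property>
    <Property name="delivery.timeout.ms">4000</Property>
</Kafka>
```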

Maven dependencies

<dependencies>
    <!-- Apache Kafka Clients-->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>3.0.0</version>
    </dependency>
    <!-- Apache Kafka Streams-->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-streams</artifactId>
        <version>3.0.0</version>
    </dependency>
    <!-- Apache Log4J2 binding for SLF4J -->
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-slf4j-impl</artifactId>
        <version>2.14.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-layout-template-json</artifactId>
        <version>2.14.1</version>
    </dependency>

    <dependency>
        <groupId>com.lmax</groupId>
        <artifactId>disruptor</artifactId>
        <version>3.4.4</version>
    </dependency>
</dependencies>

edit

The application terminates as expected if I stop the Kafka appender first.

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.mom.kafka.KafkaAppender;

public class HelloWorld {
    private static final Logger logger = LogManager.getLogger(HelloWorld.class);

    public static void main(String[] args) {
        logger.info("test1");
        logger.error("test2");

        // Stop the Kafka appender before shutting down the logging system;
        // otherwise shutdown blocks while the producer retries delivery.
        LoggerContext context = (LoggerContext) LogManager.getContext(false);
        KafkaAppender kafkaAppender = context.getConfiguration().getAppender("Kafka");
        kafkaAppender.stop();

        LogManager.shutdown();
    }
}

This is where it hangs (screenshot omitted).

The problem is very likely caused by KAFKA-3539.

  • How much time does the application hang when there is no Kafka started? – geobreze Oct 14 '21 at 14:03
  • Good question. I have just found out that it depends on how many messages have been logged. Each message adds an additional minute. The app terminates in one minute if one message has been logged. It is two minutes for two messages and so on. – MPeli Oct 14 '21 at 15:18
  • I think that delivery timeout is the case here. You can adjust this option in your XML config http://kafka.apache.org/documentation.html#producerconfigs_delivery.timeout.ms. But I would recommend considering some kind of health check to identify when something goes wrong – geobreze Oct 14 '21 at 15:43
  • Thank you. Changing delivery.timeout.ms had no effect. I have set [max.block.ms](http://kafka.apache.org/documentation.html#producerconfigs_max.block.ms) to 2000ms and now the app terminates much sooner, but it still depends on the number of logged messages: termination time = number of messages * 2000ms. – MPeli Oct 14 '21 at 16:13
  • But why do you need that behaviour? Is it because prod env uses Kafka and locally you don't want to use it? If so, I would propose to use maven profiles for different log4j configs [here](https://stackoverflow.com/questions/9543219/how-to-configure-maven-to-use-different-log4j-properties-files-in-different-envi) or if you're using Spring, it will be even [easier](https://stackoverflow.com/a/38701226/4655217) – geobreze Oct 14 '21 at 16:30
  • Production apps use Kafka and they have to start or continue working even if Kafka is down. Our C++ apps use librdkafka and there are no such issues. – MPeli Oct 14 '21 at 16:39
  • I see that you're using `syncSend="true"` in your configuration. This makes the thread block, whereas `librdkafka` produces messages asynchronously without blocking the thread. – geobreze Oct 14 '21 at 17:02
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/238148/discussion-between-mpeli-and-geobreze). – MPeli Oct 14 '21 at 17:29
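The timing reported in the comments (hang time grows linearly with the number of buffered messages, each blocking for up to `max.block.ms`) can be sketched as a quick back-of-the-envelope check. The class and method names here are made up for illustration only:

```java
// Models the observed shutdown delay: with the broker unreachable,
// each buffered record blocks for up to max.block.ms, so the total
// hang time is roughly (number of messages) * (max.block.ms).
public class ShutdownDelayEstimate {
    static long estimatedHangMillis(int bufferedMessages, long maxBlockMs) {
        return bufferedMessages * maxBlockMs;
    }

    public static void main(String[] args) {
        // Three logged messages with max.block.ms=2000
        System.out.println(estimatedHangMillis(3, 2000L)); // prints 6000
    }
}
```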

0 Answers