
I’m looking for the best/newest way to send Spring Boot application logs directly to an Elasticsearch server, without using Filebeat or Logstash. How can I do that? Is there a simple, modern way in Spring Boot, or a reputable library, to achieve this?

What I need is to send logs directly from Spring Boot to Elasticsearch without any intermediate service like Logstash. A third-party library that can be added to pom.xml and handles this completely is fine; I just need the Spring application itself to do the sending. I have checked some similar questions on Stack Overflow.

However, some of those libraries are deprecated now and some have not been updated in a long time. I would like to know about a current library or approach for doing this. Basically, whatever the application writes to the console should be sent to Elasticsearch.

H Athukorala
  • Business applications are meant to solve business problems, not monitoring, so the best practice is to configure log4j to push to log files. That keeps the in-application work lightweight and independent of Elasticsearch. Then use a Logstash pipeline to index the logs into Elasticsearch (a sketch of such a pipeline follows these comments). – Jinna Balu Nov 01 '21 at 16:37
  • Also, another reason why it's not a good idea to have your application push logs directly to ES: if ES is down for maintenance, you lose logs, and if ES is slow, your application is slow as well. – Val Nov 03 '21 at 06:39
  • This has been already answered, please have a look: https://stackoverflow.com/a/45627472/5659521 – Amit Meena Nov 08 '21 at 09:06
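For context, the log-file-plus-Logstash route suggested in the first comment usually amounts to a small pipeline definition like the sketch below; the file path, host, and index name are placeholders:

input {
  file {
    path => "/var/log/myapp/application.log"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "myapp-logs"
  }
}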

3 Answers


You could include the logback-elasticsearch-appender library in your pom.xml and use its com.internetitem.logback.elasticsearch.ElasticsearchAppender class as the appender in the logging configuration in your logback.xml file:

<dependency>
    <groupId>com.internetitem</groupId>
    <artifactId>logback-elasticsearch-appender</artifactId>
    <version>1.6</version>
</dependency>
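For reference, wiring that appender into logback.xml could look roughly like the sketch below, which follows the project's README; verify the element names (url, index) against the version you actually pull in:

<configuration>
    <appender name="ELASTIC" class="com.internetitem.logback.elasticsearch.ElasticsearchAppender">
        <!-- Elasticsearch bulk endpoint -->
        <url>http://localhost:9200/_bulk</url>
        <!-- daily index for log documents -->
        <index>logs-%date{yyyy-MM-dd}</index>
    </appender>
    <root level="INFO">
        <appender-ref ref="ELASTIC"/>
    </root>
</configuration>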

Is that one of the libraries you didn't want to use because it hasn't been updated recently? If so, you could write a custom appender and point to it in the logback.xml file. A simple implementation of such an appender could look like the one below:

package com.example.demo;

import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.springframework.web.reactive.function.client.WebClient;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import reactor.core.publisher.Mono;

public class ElasticSearchAppender extends AppenderBase<ILoggingEvent> {
    private static final String ELASTIC_SEARCH_API_HOST = "http://localhost:9200";
    private static final String ELASTIC_SEARCH_INDEX_NAME = "dummy-index";
    private static final WebClient webClient = WebClient.create(ELASTIC_SEARCH_API_HOST);
    // Use java.util.logging for the appender's own errors so a failure here
    // doesn't recurse back into logback (and into this appender)
    private static final Logger LOGGER = Logger.getLogger(ElasticSearchAppender.class.getName());
    public static final DateTimeFormatter ISO_8601_FORMAT = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX")
            .withZone(ZoneId.systemDefault());

    @Override
    protected void append(ILoggingEvent eventObject) {
        Map<String, Object> loggingEvent = new LinkedHashMap<>();
        loggingEvent.put("@timestamp",
                ISO_8601_FORMAT.format(Instant.ofEpochMilli(eventObject.getTimeStamp())));
        // getFormattedMessage() renders {} placeholders; getMessage() would send the raw pattern
        loggingEvent.put("message", eventObject.getFormattedMessage());
        loggingEvent.put("level", eventObject.getLevel().toString());
        loggingEvent.put("logger", eventObject.getLoggerName());
        // Add additional fields, e.g. MDC entries such as a trace id
        eventObject.getMDCPropertyMap().forEach(loggingEvent::put);

        // Fire-and-forget POST; on failure the error is logged locally and the event is dropped
        webClient.post()
            .uri("/{logIndex}/_doc", ELASTIC_SEARCH_INDEX_NAME)
            .bodyValue(loggingEvent)
            .retrieve()
            .bodyToMono(Void.class)
            .onErrorResume(exception -> {
                LOGGER.log(Level.SEVERE, "Unable to send log to elastic", exception);
                return Mono.empty();
            })
            .subscribe();
    }

}

logback.xml:

<configuration>
    <appender name="ELASTIC" class="com.example.demo.ElasticSearchAppender" />
    <root level="INFO">
        <appender-ref ref="ELASTIC"/>
    </root>
</configuration>
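With that configuration in place, any SLF4J logging in the application is shipped by the custom appender. A minimal (hypothetical) class, just to illustrate the flow:

package com.example.demo;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DemoService {

    // Standard SLF4J logger; logback routes its output to the ELASTIC appender
    private static final Logger log = LoggerFactory.getLogger(DemoService.class);

    public void processOrder(String orderId) {
        // Becomes a document in dummy-index; the {} placeholder is substituted
        // because the appender sends getFormattedMessage()
        log.info("Processing order {}", orderId);
    }
}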
devatherock
  • Two answers in one here, but beware: if you write your own, _you_ have to handle all the failure cases, like dropped or slow connections. – drekbour Nov 03 '21 at 21:18
  • Works great for me. Only word of caution: In my case, the logs were not displayed in Kibana at first. They only appeared once I renamed the "@timestamp" from the above example to simply "timestamp". Then everything worked great. Not sure why it is "@timestamp" in this example. – Kira Resari Nov 15 '22 at 10:30
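If you hit the same Kibana issue, an alternative to renaming the field is to map @timestamp explicitly as a date before any documents are indexed; a sketch, assuming the local server and dummy-index name from the answer above:

curl -X PUT "http://localhost:9200/dummy-index" \
  -H 'Content-Type: application/json' \
  -d '{"mappings": {"properties": {"@timestamp": {"type": "date"}}}}'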

Set up your self-hosted Elastic server, add the Elastic APM agent jar to the project, and add the Elastic configuration to your Dockerfile:

java -javaagent:elastic-apm-agent-1.0.jar \
 -Delastic.apm.service_name=project_name \
 -Delastic.apm.server_urls=http://elastic_server_url \
 -Delastic.apm.secret_token= \
 -Delastic.apm.environment=$ACTIVE_PROFILE \
 -Delastic.apm.application_packages=package_name \
 -jar your_application.jar
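Wired into a Dockerfile, that could look roughly like the sketch below; the base image, paths, and jar names are assumptions:

FROM eclipse-temurin:17-jre
WORKDIR /app
# Jar names are placeholders matching the command above
COPY elastic-apm-agent-1.0.jar elastic-apm-agent-1.0.jar
COPY target/your_application.jar your_application.jar
ENTRYPOINT java -javaagent:elastic-apm-agent-1.0.jar \
    -Delastic.apm.service_name=project_name \
    -Delastic.apm.server_urls=http://elastic_server_url \
    -jar your_application.jar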
ZahraAsgharzade

You can use the LOGGER that Spring Boot provides:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

private static final Logger LOGGER = LogManager.getLogger(YourClass.class);

LOGGER.info("PRINT THIS OUT");

This should show up in a file called "Messages" in Elastic Beanstalk.

This comes from the spring-boot-starter-web dependency:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
semiColon