
I have Spring Boot applications that use slf4j/logback, and I am looking for a centralized logging solution.

Now I see that I may not need a log collector (like logstash/filebeat/rsyslog): Elasticsearch has a built-in Ingest Node that can pre-process documents directly (if I understand it properly).

How can I integrate Logback with the Ingest Node?

I would like to use the Slf4j MDC and hope that the integration supports it out of the box.
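
For context, this is the kind of MDC usage I have in mind (the class and field names are purely illustrative):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class OrderService {
    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    void handle(String requestId) {
        // each MDC entry should end up as a searchable field on the log event
        MDC.put("requestId", requestId);
        try {
            log.info("processing order");
        } finally {
            MDC.remove("requestId");
        }
    }
}
```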

gavenkoa
  • The ingest node is there just as a **light pre-processing layer**; it's not something that magically puts logs inside ES (see the pipeline sketch after these comments). You still need something to send the logs to Elasticsearch. I am not sure if there is something specific to slf4j, but there is a log4j input in Logstash: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html – Andrei Stefan May 15 '17 at 08:44
  • Thanks for the interest! `filebeat`, as a log gathering/caching tool, is able to put data into Elasticsearch directly, without `logstash`. `logstash` is a poor man's solution (1 GiB of RAM just to pull/push logs???) written in JRuby. That's for kids; I am looking for a way to avoid it. With `filebeat` a lot of work needs to be done to pass the MDC (see the filebeat sketch after these comments). https://www.elastic.co/products/beats/filebeat – gavenkoa May 15 '17 at 15:16
  • Filebeat is definitely lighter, but it doesn't have the same processing power as Logstash. And if you plan on using the ingest functionality in Elasticsearch and don't need the overhead of a Logstash instance, a Beat is definitely a great option. – Andrei Stefan May 15 '17 at 15:20
  • Our production flow uses slf4j/logback and a couple of logback plugins. We use https://github.com/logstash/logstash-logback-encoder to format the messages into JSON consumable by Elastic; it can be configured to emit the MDC (see the encoder sketch after these comments). You could write this to a file and ingest it with filebeat. We instead use https://github.com/danielwegener/logback-kafka-appender to write the logs directly to a Kafka topic, and logstash to consume from Kafka. – roby May 23 '17 at 07:47
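
A minimal ingest pipeline sketch, to make the "light pre-processing layer" point concrete. The pipeline name `app-logs` and the field it sets are assumptions, not anything from the thread; the pipeline only transforms documents handed to it, so something still has to deliver them:

```
PUT _ingest/pipeline/app-logs
{
  "description": "light per-document pre-processing before indexing",
  "processors": [
    { "set": { "field": "environment", "value": "production" } }
  ]
}
```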
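A minimal filebeat sketch for shipping JSON log lines straight to Elasticsearch through that pipeline, assuming the application already writes one JSON object per line. The paths, host, and pipeline name are assumptions, and newer filebeat releases use `filebeat.inputs` in place of `filebeat.prospectors`:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/myapp/*.json
    # decode each line as JSON so MDC entries become top-level document fields
    json.keys_under_root: true
    json.add_error_key: true

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: "app-logs"  # the ingest pipeline sketched above
```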
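And a minimal logback sketch for producing those JSON lines with logstash-logback-encoder, per roby's comment. The file locations are assumptions; `LogstashEncoder` includes MDC entries as top-level JSON fields by default:

```xml
<configuration>
  <appender name="JSON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/myapp/app.json</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>/var/log/myapp/app.%d{yyyy-MM-dd}.json</fileNamePattern>
      <maxHistory>7</maxHistory>
    </rollingPolicy>
    <!-- one JSON object per line; MDC and logger context emitted automatically -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="JSON_FILE"/>
  </root>
</configuration>
```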

0 Answers