
We started using Serilog in combination with Elasticsearch, and it's a very efficient way to store structured log data (and later visualize it using tools like Kibana). However, I see the advantage of not writing log data directly to the backend, but instead configuring a log broker such as Logstash that can take responsibility for adding tags to log messages, selecting indexes, etc. With this setup, applications won't need any knowledge of how log data is distributed.

With Logstash in the middle, the question is which Serilog sink is best to use so that Logstash can import its data without applying advanced and CPU-intensive filters. I've seen Redis mentioned as a good companion to Logstash, but Serilog doesn't have a Redis sink. Any recommendations for a Serilog sink whose data can easily be transferred by Logstash to an Elasticsearch index?

There is even an approach that uses the Elasticsearch sink first and then loops the data back into Elasticsearch again after some rearrangement and the application of extra tags.

Vagif Abilov

5 Answers


The accepted answer was written before the sink Serilog.Sinks.Http existed.

Instead of logging to file and having Filebeat monitor it, one could have the HTTP sink post log events to the Logstash HTTP input plugin. This would mean fewer moving parts on the instances where the logs were created.
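A minimal sketch of the HTTP sink side might look like this (the endpoint URL and port are assumptions, and the exact overload signature varies between versions of Serilog.Sinks.Http, so check the package README for your version):

```csharp
using Serilog;

// Post log events in batches to a Logstash `http` input listening on
// http://logstash.local:31311 (hypothetical address).
var log = new LoggerConfiguration()
    .WriteTo.Http(
        requestUri: "http://logstash.local:31311",
        queueLimitBytes: null) // null = unbounded buffer; tune for production
    .CreateLogger();

log.Information("Order {OrderId} processed in {Elapsed} ms", 42, 17);
log.Dispose(); // flush pending batches before shutdown
```

On the Logstash side, an `http` input listening on the same port completes the pipeline (see the last answer below for an example).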

FantasticFiasco

I received a suggestion from Nicholas Blumhardt (Serilog creator) to use RollingFileSink with JsonFormatter.
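A sketch of that suggestion (note that in current Serilog the rolling behaviour has moved into Serilog.Sinks.File, so RollingFileSink is the legacy spelling; the path and interval here are illustrative):

```csharp
using Serilog;
using Serilog.Formatting.Json;

// JsonFormatter writes one compact JSON event per line - a shape that
// Logstash's json codec (or Filebeat's JSON options) can consume directly.
var log = new LoggerConfiguration()
    .WriteTo.File(
        new JsonFormatter(),
        "logs/myapp-.json",
        rollingInterval: RollingInterval.Day)
    .CreateLogger();
```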

Vagif Abilov
  • Yes it did! I only need to figure out the best way to log exceptions, but that's food for another question. – Vagif Abilov Aug 15 '14 at 23:47
  • ...and presumably using lumberjack to ship? – Ben Collins Aug 14 '15 at 16:47
    RollingFileSink saves logs to file. How can you use it to transfer logs to Logstash? – wziska Jun 14 '16 at 13:16
  • filebeat can ship logs in file to logstash. https://www.elastic.co/products/beats/filebeat – Shetty Aug 08 '16 at 14:23
  • Can we customize the format of the JSON? We want all elements to be on the same level in the JSON. I can only configure it with a properties element, which is not accepted by the team that implements ELK. – Krtti Dec 15 '16 at 01:11
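To connect the dots from the Filebeat comment above: a minimal filebeat.yml along these lines would tail the JSON files and forward them to Logstash. Paths, the host, and option names differ between Filebeat versions, so treat this as a sketch:

```
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.json
    json.keys_under_root: true   # decode each line as a JSON event
output.logstash:
  hosts: ["localhost:5044"]      # port of the Logstash beats input
```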

I've created a sink for Serilog that supports RabbitMQ, which ties extremely well into Logstash using Logstash's rabbitmq input plugin:

https://www.nuget.org/packages/Serilog.Sinks.RabbitMQ

If you run an instance of RabbitMQ on your application server, you can then log with Serilog to this RabbitMQ instance using the RabbitMQSink, ultimately bypassing network segregation/outage scenarios.

Logstash will then pick up the messages from the queue when the network is back up.
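On the Logstash side, the matching rabbitmq input could look roughly like this (the host and queue name are assumptions and must match whatever the sink is configured to publish to):

```
input {
  rabbitmq {
    host => "localhost"
    queue => "serilog-events"   # must match the sink's target queue
    durable => true
  }
}
```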

Update: The Sink is now in V2.0 and supports ASP.NET Core 1.0


Perhaps the most up-to-date answer would be:

Use the ElasticSearch Sink.

It does include a durability mode so logs aren't lost if the application is offline.
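A sketch of enabling that durable mode (the URI and buffer path are illustrative, and these option names come from Serilog.Sinks.Elasticsearch and may differ between versions):

```csharp
using System;
using Serilog;
using Serilog.Sinks.Elasticsearch;

var log = new LoggerConfiguration()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        // With a buffer file name set, events are written to disk first and
        // shipped in the background, surviving Elasticsearch outages.
        BufferBaseFilename = "./buffer/log",
        AutoRegisterTemplate = true
    })
    .CreateLogger();
```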

Bruno Garcia
    If you want to reuse filtering and processing setup across multiple incoming logs, introducing Logstash still seems like a valid move... I don't think this sink is compatible with Logstash out of the box... so not sure this can qualify as an answer... – chaami Dec 17 '19 at 11:34
  • @chaami Correct - This sink is not currently compatible with LogStash, but I think the answer should stay, as it may be relevant for those who come looking and realize they actually just want to communicate directly with ElasticSearch. In addition, I suspect LogStash may be supported at a later date, since Beats for instance can talk to both LogStash and ElasticSearch. – Chris Jensen Feb 24 '21 at 10:06

I used this solution and it worked for me.
1. Logstash pipeline configuration:

input {
  http {
    host => "0.0.0.0"
    port => 31000
    codec => json
  }
}
#filter {
#   
#}
output {
    #stdout {}
    elasticsearch {
        hosts => ["127.0.0.1:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
    }
}

2. Serilog configuration (appsettings.json):

{
  "Serilog": {
    "Using": [ "Serilog.Sinks.Http" ],
    "MinimumLevel": "Information",
    "WriteTo": [
      {
        "Name": "Async",
        "Args": {
          "bufferSize": 10000000,
          "configure": [
            {
              "Name": "Http",
              "Args": {
                "requestUri": "http://127.0.0.1:31000",
                "textFormatter": "Serilog.Formatting.Elasticsearch.ElasticsearchJsonFormatter, Serilog.Formatting.Elasticsearch",
                "batchFormatter": "Serilog.Sinks.Http.BatchFormatters.ArrayBatchFormatter, Serilog.Sinks.Http"
              }
            }
          ]
        }
      }
    ],
    "Properties": {
      "Application": "Test1"
    }
  },
  "AllowedHosts": "*"
}