
We are using Serilog, but the Docker runtime has a 16k limit for log lines written to standard out, so large log messages get split across multiple lines. FluentD then reads each line as a separate log entry. This is a known issue.

Is there any way that Serilog can detect a large log message and then split it into smaller logs that are below the limit?

VAS

1 Answer


What you want to do should be possible with a custom sink implementation that you pipe your logs through. Serilog lets you "chain" multiple sinks so that log events flow from one to the next, like in a pipeline.

You could create a custom sink that inspects the size of each log event and splits it into multiple smaller events whenever the threshold is exceeded. You then add this new sink right before your last sink (the one that pushes the log events to your log aggregator).
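As a rough sketch of the wiring (the `ChunkingSink` type is made up and sketched further below, and the `"Payload"` property name is just an example; Serilog's `Logger` itself implements `ILogEventSink`, so it can serve as the inner sink):

```csharp
using Serilog;
using Serilog.Core;

// The "last" sink (console, which Docker captures from stdout) is built as
// its own Logger, then wrapped by a hypothetical ChunkingSink that splits
// oversized events before forwarding them.
Logger stdout = new LoggerConfiguration()
    .WriteTo.Console()
    .CreateLogger();

Log.Logger = new LoggerConfiguration()
    .WriteTo.Sink(new ChunkingSink(stdout, propertyName: "Payload"))
    .CreateLogger();
```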

I don't think there is anything native to Serilog to do this out of the box. You'd need to define your own size constraints and the logic for splitting an event (what to do with the properties, etc.) yourself.

If a specific property is causing the trouble, the logic is simpler: you can split just that property and generate multiple cloned events (with all other properties kept intact), passing the parts of the oversized value to them in sequence.
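A minimal sketch of that idea, assuming the oversized value is a string-valued property. The class name, the `ChunkIndex`/`ChunkCount` marker properties, and the 16 KB default are all illustrative, not anything built into Serilog:

```csharp
using System;
using System.Linq;
using Serilog.Core;
using Serilog.Events;

// Hypothetical wrapper sink: splits an oversized string property into
// chunks and emits one cloned event per chunk, forwarding everything
// else to the wrapped inner sink unchanged.
public class ChunkingSink : ILogEventSink
{
    private readonly ILogEventSink _inner;
    private readonly string _propertyName;
    private readonly int _maxLength;

    public ChunkingSink(ILogEventSink inner, string propertyName, int maxLength = 16 * 1024)
    {
        _inner = inner;
        _propertyName = propertyName;
        _maxLength = maxLength;
    }

    public void Emit(LogEvent logEvent)
    {
        // Pass through untouched unless the property is a string over the limit.
        if (!logEvent.Properties.TryGetValue(_propertyName, out var value)
            || value is not ScalarValue { Value: string text }
            || text.Length <= _maxLength)
        {
            _inner.Emit(logEvent);
            return;
        }

        var chunkCount = (text.Length + _maxLength - 1) / _maxLength;
        for (var i = 0; i < chunkCount; i++)
        {
            var chunk = text.Substring(i * _maxLength,
                Math.Min(_maxLength, text.Length - i * _maxLength));

            // Clone the event: same timestamp, level, exception and template,
            // all other properties intact, with the big property replaced by
            // one chunk plus markers so the parts can be reassembled downstream.
            var properties = logEvent.Properties
                .Where(p => p.Key != _propertyName)
                .Select(p => new LogEventProperty(p.Key, p.Value))
                .ToList();
            properties.Add(new LogEventProperty(_propertyName, new ScalarValue(chunk)));
            properties.Add(new LogEventProperty("ChunkIndex", new ScalarValue(i)));
            properties.Add(new LogEventProperty("ChunkCount", new ScalarValue(chunkCount)));

            _inner.Emit(new LogEvent(logEvent.Timestamp, logEvent.Level,
                logEvent.Exception, logEvent.MessageTemplate, properties));
        }
    }
}
```

Note that splitting on a fixed character count is a simplification: Docker's 16k limit applies to the rendered output bytes, so in practice you'd want to leave a margin for the rest of the rendered message and for multi-byte characters.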

Alternatively, if the property (or properties) causing the issue is not very relevant to you, you could trim it in an enricher implementation. An enricher can inspect a specific property and apply any transformation to it. Note that enrichment cannot split events by itself: it is a mechanism for augmenting existing log entries.
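A truncating enricher might look something like this (again, the class name, property name, and limit are assumptions for illustration):

```csharp
using Serilog.Core;
using Serilog.Events;

// Hypothetical enricher: truncates a single string property so the rendered
// event stays under the line limit. It cannot split events, only reshape them.
public class TruncatingEnricher : ILogEventEnricher
{
    private readonly string _propertyName;
    private readonly int _maxLength;

    public TruncatingEnricher(string propertyName, int maxLength = 16 * 1024)
    {
        _propertyName = propertyName;
        _maxLength = maxLength;
    }

    public void Enrich(LogEvent logEvent, ILogEventPropertyFactory propertyFactory)
    {
        if (logEvent.Properties.TryGetValue(_propertyName, out var value)
            && value is ScalarValue { Value: string text }
            && text.Length > _maxLength)
        {
            // Replace the property with a truncated copy; the "[truncated]"
            // suffix is just a marker for readers of the log.
            logEvent.AddOrUpdateProperty(propertyFactory.CreateProperty(
                _propertyName, text.Substring(0, _maxLength) + " [truncated]"));
        }
    }
}
```

You would register it on the `LoggerConfiguration` with `.Enrich.With(new TruncatingEnricher("Payload"))`.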

julealgon
  • Thanks! Yeah, we guessed this could be done with a sink, but if it has to be custom built, then we are looking for an example sink to modify. An example showing how to detect the size and then split the logs would be helpful. – user15776274 May 14 '21 at 12:47