
We have a setup where a program logs to a .json file in a format that follows the GELF specification.

Currently this is sent to a Graylog2 server over HTTP. This works, but due to the nature of HTTP there is significant latency, which becomes an issue when there is a large number of log messages.

I want to change the HTTP delivery method to UDP, in order to just 'fire and forget'.

The logs are written to files like this:

{ "short_message": "<message>", "host": "<host>", "full_message": "<message>", "_extraField1": "<value>", "_extraField2": "<value>", "_extraField3": "<value>" }

Current configuration is this:

<Extension json>
    Module xm_json
</Extension>

<Input jsonLogs>
    Module        im_file
    File          '<File Location>'
    PollInterval  5
    SavePos       True
    ReadFromLast  True
    Recursive     False
    RenameCheck   False
    CloseWhenIdle True
</Input>

<Output udp>
    Module        om_udp
    Host          <IP>
    Port          <Port>
    OutputType    GELF_UDP
</Output>

With this setup, part of the JSON log message is added to the "message" field of a GELF message and sent to the server.

I've tried adding the line `Exec parse_json()`, but this simply results in all fields other than short_message and full_message being excluded.

I'm unsure how to configure this correctly. Even just getting the complete log message into a single field would be acceptable, since I can add an extractor on the server side.
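For example, a rough sketch of that fallback (untested; the choice of full_message is just for illustration) would be to copy the raw line into a field in the Input block:

<Input jsonLogs>
    Module        im_file
    File          '<File Location>'
    # Copy the unparsed JSON line into full_message so that the custom
    # fields at least reach Graylog and can be pulled out by an extractor.
    Exec          $full_message = $raw_event;
</Input>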

NT93

1 Answer


You'd need Exec parse_json() in order for GELF_UDP to generate proper output, but it's unclear from the question what the exact issue is with message and full_message/short_message.
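For reference, a minimal sketch of where that would go (assuming your existing xm_json extension is loaded, along with xm_gelf, which provides the GELF_UDP output type):

<Input jsonLogs>
    Module        im_file
    File          '<File Location>'
    # parse_json() from xm_json splits the JSON line into individual NXLog
    # fields; GELF_UDP then serialises those fields, including the _extra*
    # ones, into the GELF message.
    Exec          parse_json();
</Input>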

Another option you could try is to simply ship the logs via om_tcp. In that case you won't need OutputType GELF_TCP, since the log lines are already formatted as GELF JSON.
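Roughly like this (untested sketch, reusing the placeholders from your config):

<Output tcp>
    Module  om_tcp
    Host    <IP>
    Port    <Port>
    # No OutputType here: each line of the file is already a GELF-formatted
    # JSON document and is forwarded as-is.
</Output>

Depending on how the Graylog input is set up, you may also need to check its framing/delimiter settings so that it accepts newline-separated messages.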

b0ti
  • When I used parse_json(), the Graylog server would accept the message, but all the "custom" fields had been stripped. – NT93 Jan 10 '18 at 09:07