11

I'm running into some issues sending log data to my Logstash instance from a simple Java application. For my use case, I'm trying to avoid using log4j/logback and instead send batches of JSON events, one per line, over a raw TCP socket. The reason is that I'm looking to send data through an AWS Lambda function to Logstash, which means storing logs to disk may not work out.

My Logstash configuration file looks like the following:

input {
  tcp {
    port => 5400
    codec => json
  }
}
filter{
  json{
    source => "message"
  }
}
output {
   elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "someIndex"
   }
}

My Java code right now just opens a TCP socket to the Logstash server and sends an event directly.

Socket socket = new Socket("logstash-hostname", 5400);
DataOutputStream os = new DataOutputStream(new BufferedOutputStream(socket.getOutputStream()));
os.writeBytes("{\"message\": {\"someField\":\"someValue\"} }");
os.flush();
socket.close();

The application connects to the Logstash host properly (if Logstash is not up, an exception is thrown when connecting), but no events show up in our ES cluster. Any ideas on how to do this are greatly appreciated!

I don't see any relevant logs in logstash.err, logstash.log, or logstash.stdout pointing to what may be going wrong.

user1553248
  • If you define the input as json, you shouldn't need to run the json filter. To see what logstash is doing, add a "stdout{ codec=>rubydebug }" output stanza and see what ends up in your stdout logs. – Alain Collins Feb 02 '16 at 07:22
  • having the same issue... netcat shows the TCP port is listening, but Logstash is not receiving or processing the input?!? – daparic Dec 15 '17 at 15:35

3 Answers

27

The problem is that your data is already deserialized by the json codec on your input, and you are trying to deserialize it again in your filter. Simply remove the json filter.
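For reference, a minimal sketch of the corrected configuration — your original config with the json filter removed and everything else unchanged:

input {
  tcp {
    port => 5400
    codec => json
  }
}
output {
   elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "someIndex"
   }
}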

Here is how I recreated your scenario:

# the json input
root@monitoring:~# cat tmp.json 
{"message":{"someField":"someValue"}}


# the logstash configuration file
root@monitoring:~# cat /etc/logstash/conf.d/test.conf
input {
  tcp {
    port => 5400
    codec => json
  }
}

filter{
}

output {
   stdout {
     codec => rubydebug
   }
}


# starting the logstash server
/opt/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf


# sending the json to logstash with netcat
nc localhost 5400 <  tmp.json

# the logstash output via stdout
{
       "message" => {
        "someField" => "someValue"
    },
      "@version" => "1",
    "@timestamp" => "2016-02-02T13:31:18.703Z",
          "host" => "0:0:0:0:0:0:0:1",
          "port" => 56812
}

Hope it helps,

alfredocambera
2

Don't forget to append \n at the end of your JSON:

os.writeBytes("{\"message\": {\"someField\":\"someValue\"} }" + '\n');
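Putting it together with the question's snippet, a minimal runnable sketch (the class name is mine; the hostname and port are the question's placeholders):

import java.io.BufferedOutputStream;
import java.io.DataOutputStream;
import java.net.Socket;

public class LogstashSender {
    public static void main(String[] args) throws Exception {
        // try-with-resources closes the socket even if a write fails
        try (Socket socket = new Socket("logstash-hostname", 5400);
             DataOutputStream os = new DataOutputStream(
                     new BufferedOutputStream(socket.getOutputStream()))) {
            // one JSON event per line; the trailing \n marks the end
            // of the event for the json codec
            os.writeBytes("{\"message\": {\"someField\":\"someValue\"} }\n");
            os.flush();
        }
    }
}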
Brian Tompsett - 汤莱恩
0

I found that when using the codec on the TCP input, the JSON parsing fails for an unknown reason. Removing the codec from the TCP input and adding only the filter resolved my JSON parsing issues.

input {
  tcp {
    port => 5400
  }
}
filter{
  json{
    source => "message"
  }
}
output {
   elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "someIndex"
   }
}

Also, I write the whole string at once and do not include "\n" in the message itself; otherwise Logstash gets an extra, empty record containing only "\r".

// 'os' here is the output stream from the question's socket
PrintWriter pw = new PrintWriter(os);
pw.println(stringOut); // println appends the platform line separator
pw.flush();

println appends the platform line separator (e.g. "\r\n" on Windows) to the end of the stream.
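If you'd rather not depend on the platform line separator, here is a small sketch (my own variation, not part of the answer above) that writes an explicit "\n" instead:

import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class ExplicitNewlineSender {
    public static void main(String[] args) throws Exception {
        String stringOut = "{\"message\": {\"someField\":\"someValue\"} }";
        try (Socket socket = new Socket("logstash-hostname", 5400);
             PrintWriter pw = new PrintWriter(new OutputStreamWriter(
                     socket.getOutputStream(), StandardCharsets.UTF_8))) {
            // print with an explicit \n instead of println, so the event
            // delimiter does not depend on the platform line separator
            pw.print(stringOut + "\n");
            pw.flush();
        }
    }
}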

JakesIV