I'm running into some issues sending log data to my Logstash instance from a simple Java application. For my use case, I'm trying to avoid using log4j/logback and instead batch JSON events on separate lines over a raw TCP socket. The reason is that I'm looking to send data from an AWS Lambda function to Logstash, which means writing logs to disk may not work out.
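For reference, this is roughly the kind of batching I'm aiming for eventually (just a sketch, with a placeholder hostname/port, not something I have working yet):

import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class LogBatchSender {
    // Sketch: write each JSON event as its own line over one TCP connection.
    public static void sendBatch(List<String> jsonEvents) throws Exception {
        try (Socket socket = new Socket("logstash-hostname", 5400);  // placeholder host/port
             BufferedWriter writer = new BufferedWriter(
                     new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8))) {
            for (String event : jsonEvents) {
                writer.write(event);   // e.g. {"message":{"someField":"someValue"}}
                writer.write("\n");    // newline-delimit each event
            }
            writer.flush();
        }
    }
}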
My logstash configuration file looks like the following:
input {
  tcp {
    port => 5400
    codec => json
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "someIndex"
  }
}
My Java code right now just opens a TCP socket to the Logstash server and sends a single event directly:
import java.io.BufferedOutputStream;
import java.io.DataOutputStream;
import java.net.Socket;

// open a socket to Logstash and write one JSON event as raw bytes
Socket socket = new Socket("logstash-hostname", 5400);
DataOutputStream os = new DataOutputStream(new BufferedOutputStream(socket.getOutputStream()));
os.writeBytes("{\"message\": {\"someField\":\"someValue\"} }");
os.flush();
socket.close();
The application connects to the Logstash host properly (if Logstash is not up, an exception is thrown when connecting), but no events are showing up in our ES cluster. Any ideas on how to get this working are greatly appreciated!
I don't see any relevant entries in logstash.err, logstash.log, or logstash.stdout that point to what may be going wrong.