I know the JDBC Rivers plugin is deprecated, so even though it is still in use here, I'd rather not build on something that is no longer supported.

However, I have a few tables in a Postgres database with values that I need to be able to search in a Kibana view. I'm new to the ELK stack, but I've been playing around with some of the samples to get familiar with it.

I've seen some mentions of using stored procedures/triggers in Postgres to send data to Logstash, although I'm not sure that's the best way. I'm not a developer but a QA engineer, so my coding skills are "OK"; I'm used to writing automation tests and the like.

What would be the best way to do this? I'd want to either capture changes to these tables (new inserts or updates) or poll the data every X seconds (30s or so). Let's pretend it's for a weather station and the tables contain humidity data from different sensors.

I'd want to be able to search the values, station ID, etc. in a Kibana view.

Is this doable? And is there perhaps a better way than triggers/stored procedures?
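
For illustration, here's roughly the shape of the tables I mean (every name here is made up for this example). The detail that seems to matter for the polling approach is a timestamp column, so that whatever polls the table can ask for "everything newer than the last run":

    -- Hypothetical humidity table; all names are invented.
    CREATE TABLE humidity_readings (
        id          SERIAL PRIMARY KEY,
        station_id  TEXT      NOT NULL,
        humidity    NUMERIC   NOT NULL,          -- relative humidity, percent
        recorded_at TIMESTAMP NOT NULL DEFAULT now()
    );

    -- An index on the timestamp keeps incremental polling cheap.
    CREATE INDEX idx_humidity_recorded_at ON humidity_readings (recorded_at);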

msmith1114
These answers might help: http://stackoverflow.com/questions/34477095/elasticsearch-replication-of-other-system-data/34477639#34477639, http://stackoverflow.com/questions/40410920/elasticsearch-usage-with-mysql/40415430#40415430, and http://stackoverflow.com/questions/31995648/logstash-jdbc-connector-time-based-data/32001923#32001923 – Val Mar 02 '17 at 05:23

1 Answer

I ended up using the JDBC input plugin, following https://www.elastic.co/blog/logstash-jdbc-input-plugin to get it up and running (which it does). Fair warning for anyone who finds this answer: it was a lot of setup.
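
To save the next person some of that setup, here's a minimal sketch of the kind of pipeline config the blog post walks through, adapted to the humidity example from the question. Every name in it (database, table, columns, credentials, driver path) is a placeholder for this example, so swap in your own:

    input {
      jdbc {
        # Path to the Postgres JDBC driver jar, which you download separately.
        jdbc_driver_library => "/path/to/postgresql-42.2.5.jar"
        jdbc_driver_class => "org.postgresql.Driver"
        jdbc_connection_string => "jdbc:postgresql://localhost:5432/weather"
        jdbc_user => "logstash"
        jdbc_password => "changeme"
        # Cron-style schedule: run the query once a minute. Sub-minute
        # intervals (e.g. every 30s) need the extended rufus-scheduler
        # syntax described in the plugin docs.
        schedule => "* * * * *"
        # Fetch only rows newer than the last value seen on the previous run.
        statement => "SELECT id, station_id, humidity, recorded_at FROM humidity_readings WHERE recorded_at > :sql_last_value ORDER BY recorded_at"
        use_column_value => true
        tracking_column => "recorded_at"
        tracking_column_type => "timestamp"
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "humidity"
        # Reuse the primary key as the document id so an updated row
        # overwrites its existing document instead of creating a duplicate.
        document_id => "%{id}"
      }
    }

Run it with bin/logstash -f your-pipeline.conf, then point a Kibana index pattern at the humidity index, and station_id/humidity/recorded_at become searchable fields. One caveat: this polling approach only picks up updated rows if the update also bumps recorded_at; rows that change without touching the timestamp will be missed.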

msmith1114