
I've been searching and reading docs on how to import MySQL data into Elasticsearch. The best solution I found was to create a Logstash config file and run it every minute, but in the end that wasn't efficient enough for my requirements (I would have to write a separate query for each table in Logstash, which isn't practical).

I need to import the whole MySQL database into Elasticsearch, including mappings, table relationships, and data, every time a user interacts with my application (and there are many users on the platform).

I've already read this link, but it doesn't help me. Any suggestions? Thanks


1 Answer

You can use Logstash with its jdbc input plugin to read from your DB and push JSON documents to Elasticsearch: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

https://qbox.io/blog/migrating-mysql-data-into-elasticsearch-using-logstash

Go through these links and follow the instructions there.
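For reference, a minimal pipeline along the lines the answer describes might look like the sketch below. The driver path, database name, table name, and tracking column are placeholders you would adapt to your own schema:

```conf
# Sketch of a Logstash pipeline: poll one MySQL table and index it into
# Elasticsearch. Paths, credentials, and the "users" table are assumptions.
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "* * * * *"   # run every minute
    # only fetch rows changed since the last run
    statement => "SELECT * FROM users WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "users"
    document_id => "%{id}"    # reuse the MySQL primary key so updates overwrite
  }
}
```

Setting `document_id` to the row's primary key keeps re-runs idempotent: a changed row updates its existing Elasticsearch document instead of creating a duplicate.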

  • 1
  • 1
    Thank you @Wijayanga for this suggestion. I already created a Logstash config file and migrated one table to Elasticsearch. My goal is to migrate the whole database (or many tables) in a single execution, i.e. in the same Logstash file. Do you have any idea how to do this? – Atom Oct 23 '19 at 09:42
  • I think you can use multiple pipelines to accomplish this, though I'm not sure — check the configuration here: https://www.elastic.co/guide/en/logstash/current/multiple-input-output-plugins.html Note that you cannot use multiple jdbc inputs in a single Logstash config file: https://discuss.elastic.co/t/can-we-have-multiple-input-and-output-section-in-logstash-config-file/127778/4 – Juan Acosta Apr 23 '21 at 01:13
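The multiple-pipelines idea in the comment above can be sketched with Logstash's `pipelines.yml`, one pipeline (and hence one jdbc input) per table. The pipeline ids and config paths here are placeholders:

```conf
# Sketch of config/pipelines.yml: run one isolated pipeline per MySQL table,
# each with its own jdbc input and elasticsearch output.
- pipeline.id: users-table
  path.config: "/etc/logstash/conf.d/users.conf"
- pipeline.id: orders-table
  path.config: "/etc/logstash/conf.d/orders.conf"
```

Each entry runs as its own pipeline with separate queues, so one table's configuration cannot interfere with another's, at the cost of maintaining one `.conf` file per table.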