I have an application that was developed in 2001. It has a table named Locations
which is used by my application and by several of my business partners to insert, update, and delete data.
This table has around 1 million rows. Performance-wise it's working fine; we have indexes and other techniques in place for retrieving the data.
Now I want to implement full-text search on it. SQL Server's full-text search is not fast enough, so I decided to use Elasticsearch instead. For this I need to ingest data from SQL Server into Elasticsearch, which I can do using Logstash, but I'm confused about the following points:
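For context, the one-shot ingestion I have in mind uses the Logstash JDBC input plugin, roughly like the sketch below (the connection string, driver path, index name, and the `LocationId` key column are placeholders for my actual values):

```conf
input {
  jdbc {
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"   # placeholder driver path
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=MyDb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * FROM Locations"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "locations"
    # use the table's primary key as the document id so re-ingesting
    # the same row overwrites the existing document instead of duplicating it
    document_id => "%{locationid}"
  }
}
```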
- How will I update the Elasticsearch index when a new record is added?
- How will I update the Elasticsearch index when an existing record is updated?
- How will I update the Elasticsearch index when a record is deleted?
If there were a single source updating the Locations table, I could work something out, but there are many clients that can update it. What should I do in this case?
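One approach I've considered is adding a last-modified column (e.g. a hypothetical `ModifiedAt` datetime maintained by a trigger) and letting Logstash poll it incrementally with `:sql_last_value`, roughly like this:

```conf
input {
  jdbc {
    # driver, connection string, and credentials omitted (same as the full load)
    schedule => "*/5 * * * *"   # poll every 5 minutes
    statement => "SELECT * FROM Locations WHERE ModifiedAt > :sql_last_value"
    use_column_value => true
    tracking_column => "modifiedat"
    tracking_column_type => "timestamp"
  }
}
```

As far as I can tell, this would pick up inserts and updates regardless of which client made them, but it still wouldn't catch hard deletes, since deleted rows never appear in the query results. Is this the right direction, or is there a better pattern?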