I have a Laravel web app that uses a Vue.js front end and MySQL as the RDBMS. One of my tables is currently 23.8 GB with 8M+ rows, and it's growing every second.
When querying this data, I join it to 4 other tables, so the full result set is enormous.
I'm currently only pulling and displaying 1,000 rows, as I don't need any more than that. Vue.js shows the data in a table, and there are 13 filter options the user can select from to narrow the results, ranging from date and name to status.
Using Eloquent with MySQL indexes in place, I've managed to get the query time down to something respectable, but I need this section of the app to be as responsive as possible.
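For context, the indexing I have so far is along these lines (table and column names here are simplified placeholders, not my real schema):

```sql
-- Simplified sketch of the big table and its indexes; the real schema is larger.
CREATE TABLE readings (
    id         BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    device_id  BIGINT UNSIGNED NOT NULL,
    status     VARCHAR(20)     NOT NULL,
    created_at TIMESTAMP       NOT NULL,
    INDEX idx_created (created_at),
    -- composite index for the common "status + date range" filter combination
    INDEX idx_status_created (status, created_at)
);
```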
Some of the WHERE clauses generated by the filters are taking 13 seconds to execute, which I feel is too long.
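To give an idea of what the filtering looks like, here's a trimmed-down sketch of the query (the real version has all 13 filters and joins to four tables; the table and column names below are made up):

```php
// Simplified sketch of the filtered query; $filters comes from the Vue front end.
// Table/column names are placeholders, not my real schema.
$rows = DB::table('readings')
    ->join('devices', 'devices.id', '=', 'readings.device_id')
    ->when($filters['status'] ?? null,
        fn ($q, $status) => $q->where('readings.status', $status))
    ->when($filters['name'] ?? null,
        fn ($q, $name) => $q->where('devices.name', 'like', "%{$name}%"))
    ->when($filters['from'] ?? null,
        fn ($q, $from) => $q->where('readings.created_at', '>=', $from))
    ->orderByDesc('readings.created_at')
    ->limit(1000)
    ->get();
```

I suspect filters like the leading-wildcard `like '%…%'` one are among the slow cases, since they can't use an index.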
I've been doing some reading and I'm thinking MongoDB or Redis might be an option, but I have very little experience with either.
For this particular scenario, what do you think would be the best option to maximise read performance?
If I were to use MongoDB, I wouldn't migrate the current data... I'd basically run a second database that contains all the new data. The app hasn't gone into production yet, and in most use cases only the last 30 days' worth of data will be needed, but the option to query older data is still required, hence keeping both MySQL and MongoDB.
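Roughly, I'm picturing that dual-store setup as two connections in Laravel's `config/database.php` (using a package like jenssegers/laravel-mongodb for the Mongo side; the names and values below are illustrative, not a working config):

```php
// config/database.php (excerpt) — illustrative sketch only
'connections' => [
    'mysql' => [
        'driver'   => 'mysql',
        'host'     => env('DB_HOST', '127.0.0.1'),
        'database' => env('DB_DATABASE', 'app'),
        // ... existing MySQL settings unchanged
    ],
    // second connection holding only recent (last ~30 days) data
    'mongodb' => [
        'driver'   => 'mongodb',
        'host'     => env('MONGO_HOST', '127.0.0.1'),
        'port'     => env('MONGO_PORT', 27017),
        'database' => env('MONGO_DATABASE', 'app_recent'),
    ],
],
```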
Any feedback will be appreciated.