
I have a Laravel web app with a Vue.js front end and MySQL as the RDBMS. I currently have a table that is 23.8 GB, contains 8M+ rows, and is growing every second.

When querying this data, I'm joining it to 4 other tables, so the entire dataset is humongous.

I'm currently only pulling and displaying 1,000 rows as I don't need any more than that. Vue.js shows the data in a table, and there are 13 filter options for the user to filter the data by, ranging from date and name to status.

Using Eloquent with MySQL indexes in place, I've managed to get the query time down to a respectable level, but I need this section of the app to be as responsive as possible.

Some of the WHERE clauses that the filters kick off are taking 13 seconds to execute, which I feel is too long.
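To illustrate the shape of the problem, here is roughly what the filtered query looks like (model, table, and filter names here are illustrative, not my actual schema):

    $query = Transaction::query()
        ->join('merchants', 'merchants.id', '=', 'transactions.merchant_id')
        ->join('divisions', 'divisions.id', '=', 'transactions.division_id')
        ->select('transactions.*', 'merchants.name as merchant_name');

    // Each filter is optional and only applied when the user selects a value.
    $query->when($request->input('status'), fn ($q, $status) => $q->where('transactions.status', $status))
        ->when($request->input('merchant'), fn ($q, $name) => $q->where('merchants.name', 'like', "%{$name}%"))
        ->when($request->input('date_from'), fn ($q, $from) => $q->where('transactions.created_at', '>=', $from));

    $rows = $query->latest('transactions.created_at')->limit(1000)->get();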

I've been doing some reading and think MongoDB or Redis may be an option, but I have very little experience with either.

For this particular scenario, what do you think would be the best option to maximise read performance?

If I were to use MongoDB, I wouldn't migrate the current data... I'd basically have a second database that contains all the new data. This app hasn't gone into production yet, and in most use cases only the last 30 days' worth of data will be required, but the option to query old data is still needed, hence keeping both MySQL and MongoDB.
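If I went that route, Laravel can hold both connections side by side. A minimal sketch of config/database.php, assuming the jenssegers/laravel-mongodb package provides the mongodb driver (connection names and env keys here are illustrative):

    'connections' => [
        'mysql' => [
            'driver' => 'mysql',
            'host' => env('DB_HOST', '127.0.0.1'),
            'database' => env('DB_DATABASE'),
            // ... remaining standard MySQL options
        ],

        // Hot store holding roughly the last 30 days of transactions.
        'mongodb' => [
            'driver' => 'mongodb',
            'host' => env('MONGO_HOST', '127.0.0.1'),
            'port' => env('MONGO_PORT', 27017),
            'database' => env('MONGO_DATABASE'),
        ],
    ],

A model would then opt into the hot store with protected $connection = 'mongodb';.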

Any feedback will be appreciated.

Dally
  • Your question is a bit amorphous, and Redis, MongoDB, and MySQL are all very different things. Maybe you should add some information about what this data is. – Tim Biegeleisen Apr 29 '19 at 05:45
  • Hi Tim, I know they're all different things, but as this project hasn't gone into production yet, I have the flexibility to make changes. I need a solution to maximise query read performance, and if any of those three options can achieve that, I have the ability to implement it. The table is storing transactional data from a payment gateway for different merchants. – Dally Apr 29 '19 at 05:53

2 Answers

  • Try converting the query into a stored procedure. In MySQL you execute a stored procedure with CALL (EXEC is SQL Server syntax), like this:

    DB::select('CALL stored_procedure("param1", "param2", ...)');
    

    or

    DB::select('CALL stored_procedure(?, ?, ...)', [$param1, $param2]);
    

    Or, to call it without parameters:

    DB::select('CALL stored_procedure()');
    
  • Try using EXPLAIN to optimise the query, as shown in the sketch below the link.

    How to optimise MySQL queries based on EXPLAIN plan
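
    For instance, you can run EXPLAIN against the exact SQL Eloquent generates (the query here is a placeholder):

    // Placeholder Eloquent query standing in for the real filtered one.
    $builder = Transaction::where('status', 'settled');

    // EXPLAIN the exact SQL and bindings Eloquent will execute.
    $plan = DB::select('EXPLAIN ' . $builder->toSql(), $builder->getBindings());

    // In the output, `type: ALL` with a large `rows` estimate indicates a
    // full table scan that a suitable index could avoid.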

Arun P
  • Hi Arun, I know how to execute stored procedures in Laravel, but I don't see how this would improve read performance. It'll be the same query, just fired as a stored procedure. I've run EXPLAIN and optimised the query as much as possible. The performance hit comes from filters that rely on joins. As an example, when searching for a name, status code, or currency, the query executes very fast, but when searching for a merchant name or division name, the execution time takes a massive hit because those columns reside in different tables and rely on joins. – Dally Apr 29 '19 at 07:13
  • 1
    The server's query processor will analyze the stored procedure and prepares an execution plan that is ultimately stored in a system table. Subsequently, the procedure is executed according to the stored plan. Since most of the query processing work has already been performed, stored procedures execute almost instantly. – Arun P Apr 29 '19 at 07:32

Try Elasticsearch. It will speed up the read process.
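A minimal sketch with Laravel Scout, assuming a Scout driver for Elasticsearch is installed (model and field names here are illustrative):

    use Illuminate\Database\Eloquent\Model;
    use Laravel\Scout\Searchable;

    class Transaction extends Model
    {
        use Searchable;

        // Flatten the joined columns into the search document so filters
        // on merchant/division names no longer need SQL joins.
        public function toSearchableArray()
        {
            return [
                'status'        => $this->status,
                'merchant_name' => $this->merchant->name,
                'division_name' => $this->division->name,
                'created_at'    => $this->created_at->timestamp,
            ];
        }
    }

    // Full-text search now runs in Elasticsearch instead of MySQL.
    $results = Transaction::search('Acme Corp')->take(1000)->get();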

Zoli