I was previously using DataTables for reports on my website, which currently deals with 3 to 4 tables with around 3k to 4k records each. DataTables, whether client side, server side, or even as a service, was slow to render records into the DOM, and clicking the paginated links took almost as long as the initial page load. So I switched to Laravel's built-in paginator; although it lacks the flexibility DataTables provides, I implemented my own AJAX search on top of it to make up for that. I want to understand what other approaches developers use to show reports in a tabular structure, potentially covering 4 to 5 years of data, while keeping page speed good. I implemented eager loading to improve the speed, which did help, but not as much as I had hoped.
1 Answer
As long as you avoid the big front-end and back-end mistakes, such as loading all the data at once and paginating it in the DOM, the main problem is usually database design and queries. Of course, the capabilities of your server (CPU, RAM, etc.) are extremely important as well.
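For illustration, here is a minimal sketch of server-side pagination in a Laravel controller, so that only one page of rows ever reaches the DOM. The `Report` model, `title` column, and `q` search parameter are placeholders, not names from the question:

```php
use App\Models\Report;
use App\Http\Controllers\Controller;
use Illuminate\Http\Request;

class ReportController extends Controller
{
    public function index(Request $request)
    {
        // Filter and paginate in the database; the DOM only ever sees 25 rows.
        $reports = Report::query()
            ->when($request->input('q'), fn ($query, $q) =>
                $query->where('title', 'like', "%{$q}%")) // hook for an AJAX search box
            ->latest()
            ->paginate(25);

        return view('reports.index', ['reports' => $reports]);
    }
}
```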
> 3 to 4 tables with around 3k to 4k records
This amount of data is really small; you will probably solve your problem once you check your indexes. If it is still slow, try one of the cache systems Laravel provides.
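As a sketch, assuming a `reports` table filtered by `status` and sorted by `created_at` (both column names are placeholders), an index can be added in a migration:

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

class AddReportIndexes extends Migration
{
    public function up()
    {
        Schema::table('reports', function (Blueprint $table) {
            // Composite index matching the report query's WHERE + ORDER BY columns.
            $table->index(['status', 'created_at']);
        });
    }

    public function down()
    {
        Schema::table('reports', function (Blueprint $table) {
            $table->dropIndex(['status', 'created_at']);
        });
    }
}
```

If a query is still hot after indexing, its result can be memoized with `Cache::remember`:

```php
use App\Models\Report;
use Illuminate\Support\Facades\Cache;

// Serve the aggregate from cache for 10 minutes; any cache store Laravel supports works.
$totals = Cache::remember('reports.totals', now()->addMinutes(10), function () {
    return Report::selectRaw('status, COUNT(*) AS total')
        ->groupBy('status')
        ->get();
});
```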
Once you reach millions of rows, you can take advantage of higher-level solutions such as database sharding, Elasticsearch, load balancing, and microservices.
It is a huge subject, but you can start with this topic. And don't forget that each type of database has its own features and solutions.
> I implemented eager loading to improve the speed, which did help, but not as much as I had hoped.
You can also compare the performance of Laravel eager loading with the query builder here.
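As a rough illustration (the `Report` model, `author` relation, and `users` table are assumptions), eager loading avoids the N+1 problem but still hydrates full Eloquent models, whereas the query builder with a join returns plain rows and is usually lighter for read-only reports:

```php
use App\Models\Report;
use Illuminate\Support\Facades\DB;

// Eager loading: two queries, full model hydration — convenient in Blade views.
$reports = Report::with('author')->paginate(25);

// Query builder: one joined query, plain stdClass rows — less overhead for read-only reports.
$rows = DB::table('reports')
    ->join('users', 'users.id', '=', 'reports.user_id')
    ->select('reports.*', 'users.name AS author_name')
    ->paginate(25);
```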

- Thanks for the answer. I will surely try to implement these. – Nitish Patra Dec 18 '18 at 08:53
- Great answer; I would like to add: have a look at chunk and cursor to improve performance. – online Thomas Dec 18 '18 at 08:57
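For reference, the `chunk` and `cursor` methods mentioned in the comment above look roughly like this; a minimal sketch, with the `Report` model again standing in as a placeholder:

```php
use App\Models\Report;

// chunk(): fetch rows in fixed-size batches so memory stays bounded.
Report::orderBy('id')->chunk(500, function ($reports) {
    foreach ($reports as $report) {
        // process or export each row
    }
});

// cursor(): stream rows one at a time through a generator — smallest memory footprint.
foreach (Report::cursor() as $report) {
    // process or export each row
}
```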