
How do I handle over 10 million records in MySQL (read-only operations)? I have two tables: `company`, which holds company records (its name and the services it provides), so 2 columns and about 3 million records; and `employee`, which has about 40 columns and about 10 million records. When trying to fetch data, even simple queries such as

Employee::paginate(100);
// In MySQL terms this becomes:
// SELECT COUNT(*) AS aggregate FROM `employee`;
// SELECT * FROM `employee` LIMIT 100 OFFSET 0;

used to take about 30 s, and now it takes forever.
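For what it's worth, the `OFFSET` form above gets slower the deeper you page, because MySQL still has to read and discard every skipped row. A keyset ("seek") pagination sketch, assuming `employee` has an auto-increment primary key `id` (the offset and id values below are only illustrative):

```sql
-- Offset pagination: cost grows with the size of the offset.
SELECT * FROM `employee` ORDER BY `id` LIMIT 100 OFFSET 500000;

-- Keyset ("seek") pagination: remember the last `id` of the previous
-- page and start the next page from there, using the primary key index.
SELECT * FROM `employee` WHERE `id` > 500100 ORDER BY `id` LIMIT 100;
```

The trade-off is that keyset pagination only supports "next page" style navigation, not jumping to an arbitrary page number.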

If I want to do a search, apply a filter, or join the two tables (company and employee), it sometimes works and sometimes crashes, producing lots of errors/warnings in the MySQL server logs. Can anyone please tell me how I can handle this volume of records more efficiently without causing a SQL server meltdown, especially during high-traffic periods? Currently, only the primary keys (ids) and join ids are indexed.
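With only primary keys indexed, a filtered join like the one described forces a scan of `employee`. A minimal sketch of adding a secondary index and checking it is used; the column names `company_id` and `department` are assumptions, not from the original schema:

```sql
-- Secondary index covering the join key plus a commonly filtered column
-- (column names here are assumptions about the schema).
ALTER TABLE `employee` ADD INDEX `idx_company_department` (`company_id`, `department`);

-- EXPLAIN shows the chosen access path without running the query,
-- so you can confirm the index is picked up before load-testing.
EXPLAIN
SELECT e.*, c.`name`
FROM `employee` e
JOIN `company` c ON c.`id` = e.`company_id`
WHERE e.`department` = 'IT';
```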

This may look like a duplicate of similar questions already asked on SO, but those did not help me much.

Questions I have followed on SO:

  1. Best data store for billions of rows
  2. Handling very large data with mysql
  3. https://dba.stackexchange.com/questions/20335/can-mysql-reasonably-perform-queries-on-billions-of-rows
  4. Is InnoDB (MySQL 5.5.8) the right choice for multi-billion rows?
  5. How big can a MySQL database get before performance starts to degrade
  6. Handling millions of records in arrays from MySQL in PHP?
user7747472
  • 'employee' is a string, so your sample queries don't make a whole lot of sense – Strawberry Oct 27 '18 at 09:24
  • @Strawberry I am using the Eloquent ORM, so I didn't write raw SQL directly. I wrote that just to give an idea of what the Eloquent query turns into. – user7747472 Oct 27 '18 at 11:46

1 Answer


Make these changes:

Ali Farhoudi
  • I will try `partitioning`, but I don't think it will help much, because all of the records were uploaded in one day. I cannot use `simplePaginate` because I want to show the total number of records for the search/filter results and use page-wise pagination. – user7747472 Oct 27 '18 at 06:26
  • Partitioning can be done on various conditions, but it depends on your queries. I personally have partitioned by date, since all of my queries depend on date, and it helps a lot. You can implement your own custom pagination. You can use Redis to store your record counts under different conditions, e.g. `employees.it.male = 30`, `employees.it.female = 30`, `employees.it.total = 60`; when adding a new male employee in the IT section, just increment `employees.it.male`. Then, when filtering by department and sex, you can use these values to get the total number of records. – Ali Farhoudi Oct 27 '18 at 08:14
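The pre-aggregated-counter idea in the comment above can also live in MySQL itself if Redis is not available. A minimal sketch, with all table and column names assumed for illustration:

```sql
-- Pre-aggregated counts, maintained by application code on each
-- insert/delete instead of running COUNT(*) over 10M rows per page view.
CREATE TABLE `employee_counts` (
  `department` VARCHAR(64) NOT NULL,
  `sex`        ENUM('male', 'female') NOT NULL,
  `total`      INT UNSIGNED NOT NULL DEFAULT 0,
  PRIMARY KEY (`department`, `sex`)
);

-- On hiring a male employee in IT: create the row or bump the counter.
INSERT INTO `employee_counts` (`department`, `sex`, `total`)
VALUES ('IT', 'male', 1)
ON DUPLICATE KEY UPDATE `total` = `total` + 1;

-- Total for a department, read from the tiny counter table:
SELECT SUM(`total`) FROM `employee_counts` WHERE `department` = 'IT';
```

The usual caveat applies: counters maintained outside the main table can drift if writes bypass the application layer, so a periodic reconciliation job against the real data is worth having.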