I am a college undergrad working on a PHP- and MySQL-based inventory management system that operates at a country-wide level. The database currently holds about 2 million entries and is projected to grow by more than 1 million entries every month.
I need to keep query times from ballooning as the data grows; they currently range from 7 to 11 seconds for most modules.
Data entered in the last month is much more likely to be accessed than older data, so I believe partitioning the data by time of entry should keep query times in check. How can I achieve this?
Specifically, I want a way to cache the last month's data so that every query first searches the tables holding recent data and falls back to the rest of the data only when the product is not found there. A rough sketch of the lookup flow I have in mind follows.
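To make that concrete, here is the kind of lookup I am picturing; inventory_recent, inventory_archive, and product_code are placeholder names, not my real schema:

```sql
-- Check the table of recent rows first; touch the archive only on a miss.
SELECT * FROM inventory_recent
WHERE product_code = 'ABC123'
UNION ALL
SELECT * FROM inventory_archive
WHERE product_code = 'ABC123'
  AND NOT EXISTS (SELECT 1 FROM inventory_recent WHERE product_code = 'ABC123');
```

(Both tables would have identical structures, so the UNION ALL lines up.)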
- You may find the following article of interest: http://chrismoos.com/2010/01/31/mysql-partitioning-tables-with-millions-of-rows/ As a side note, it may be useful to know how your tables are indexed (7-11 seconds seems like a long query time). – cgwyllie Aug 12 '12 at 21:53
- Also potentially interesting: http://stackoverflow.com/questions/1114619/mysql-performance-on-6-million-row-table and http://stackoverflow.com/questions/3695768/table-with-80-million-records-and-adding-an-index-takes-more-than-18-hours-or-f – cgwyllie Aug 12 '12 at 21:54
1 Answer
If you want to use the partitioning functions of MySQL, have a look at this article.
That being said, there are a few restrictions when using partitions (see the sketch after this list):
- every unique key (including the primary key) must include all columns used in the partitioning expression
- you lose some database portability, as partitioning works quite differently in other databases.
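As a minimal sketch of month-based range partitioning (the table and column names here, inventory and entry_date, are assumptions, not your actual schema):

```sql
-- Range-partition by month of entry. Note how the first restriction above
-- forces entry_date into the primary key.
CREATE TABLE inventory (
    id           INT NOT NULL AUTO_INCREMENT,
    product_code VARCHAR(32) NOT NULL,
    quantity     INT NOT NULL,
    entry_date   DATE NOT NULL,
    PRIMARY KEY (id, entry_date)
)
PARTITION BY RANGE (TO_DAYS(entry_date)) (
    PARTITION p201206 VALUES LESS THAN (TO_DAYS('2012-07-01')),
    PARTITION p201207 VALUES LESS THAN (TO_DAYS('2012-08-01')),
    PARTITION p201208 VALUES LESS THAN (TO_DAYS('2012-09-01')),
    PARTITION pmax    VALUES LESS THAN MAXVALUE
);
```

A query that filters on entry_date for the last month then only scans the newest partition; EXPLAIN PARTITIONS will show which partitions a query actually touches.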
You can also handle partitioning manually by moving old records to an archive table at regular intervals. Of course, you will then also have to implement separate code to read those archived records. A sketch of the archiving step follows.
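A minimal sketch of such an archiving job, assuming a live table inventory and an identically structured inventory_archive (both hypothetical names), run at regular intervals from a cron job:

```sql
-- Fix the cutoff once so both statements move exactly the same rows.
SET @cutoff = DATE_SUB(CURDATE(), INTERVAL 1 MONTH);

INSERT INTO inventory_archive
SELECT * FROM inventory WHERE entry_date < @cutoff;

DELETE FROM inventory WHERE entry_date < @cutoff;
```

With InnoDB, wrapping the two statements in a transaction keeps the move atomic.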
Also note that your query times seem quite long; I have worked with tables much larger than 2 million records and seen much better access times.

Guillaume
- Thanks Guillaume, that was really helpful. Could you suggest any PHP-based APIs that could help me build the data-archiving feature you mentioned? – Aug 13 '12 at 19:11
- No, I really can't help you with PHP; my expertise is more with Java, sorry. – Guillaume Aug 13 '12 at 20:53