1

I'm creating an e-commerce web application using PHP and MySQL (MyISAM), and I want to know how to speed up my queries.

I have a products table with over a million records and the following columns: id (int, primary key), catid (int), usrid (int), title (int), description (int), status (enum), date (datetime).

Recently I split this one table into multiple tables based on the product category (catid), thinking that it might reduce the load on the server.

Now I need to fetch results from all of these tables combined, with the following sets of conditions:

1. results matching a usrid and status (to fetch a user's products)
2. results matching status and title or description (e.g. for a product search)

Currently I have to use a UNION to fetch results from all of these tables combined, which is slowing down performance, and I can't apply a LIMIT to the combined result set either. I thought of creating an index on all of these columns to speed up searching, but that might slow down the INSERTs and UPDATEs. I'm also beginning to think that splitting the table was not a good idea in the first place.
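
Roughly, my current product search query looks like this (only two category tables shown; the table names and the search term are just placeholders, and the real query has one SELECT per category table):

```sql
-- Simplified sketch of my current per-category UNION; names and values are placeholders.
SELECT id, title, status, date
FROM products_1
WHERE status = 'active'
  AND (title LIKE '%phone%' OR description LIKE '%phone%')
UNION
SELECT id, title, status, date
FROM products_2
WHERE status = 'active'
  AND (title LIKE '%phone%' OR description LIKE '%phone%');
```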

I would like to know the best approach to optimizing data retrieval in such a situation. I'm open to new database schema proposals as well.

avib
  • 69
  • 6
  • Can you please provide us with a sample Database-Layout? Also, did you consider using `JOIN`? – Bobby Dec 23 '10 at 10:58
  • It would be helpful if you gave an overview of your current database schema, with the relevant tables and keys. For tables with millions of records, I suggest not UNION-ing or JOIN-ing them to other tables at all, but doing single queries on the table and letting the script combine your results. – Oli Dec 23 '10 at 11:00
  • 1
    Regarding indexes: http://stackoverflow.com/questions/4377525/using-more-than-one-index-per-table-is-dangerous/4382833#4382833 – Ronnis Dec 23 '10 at 11:05
  • @Bobby a JOIN won't give me the desired result here if I want to search all the tables together. – avib Dec 23 '10 at 11:34
  • @Oli combining the results in the script is a thought that has crossed my mind, and it will be my last resort, especially because applying a LIMIT to the UNION query will not be possible. – avib Dec 23 '10 at 11:37
  • Also, please tell me what else you need to know about my database that I've not already provided. – avib Dec 23 '10 at 11:48

2 Answers

1

To start: load test and turn on the MySQL slow query log.

Some other suggestions:

If you stay with separate tables per category, use UNION ALL instead of UNION. UNION implies distinctness, which makes the database engine do extra work to deduplicate the rows unnecessarily.
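
For example, a rough sketch (the table names products_1/products_2 and the filter values are placeholders). Note that if you wrap each SELECT in parentheses, MySQL applies a trailing ORDER BY/LIMIT to the combined result, which also addresses the LIMIT problem you mention:

```sql
-- UNION ALL skips the duplicate-elimination pass that plain UNION performs.
-- Parenthesizing each SELECT lets the final ORDER BY / LIMIT apply to the
-- combined result set rather than to the last SELECT only.
(SELECT id, title, date FROM products_1 WHERE usrid = 42 AND status = 'active')
UNION ALL
(SELECT id, title, date FROM products_2 WHERE usrid = 42 AND status = 'active')
ORDER BY date DESC
LIMIT 20;
```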

Indices do add a write penalty, but the workload you describe probably has a read-write ratio of at least 10 to 1, and more likely 1000 to 1 or higher. So index. For the two queries you describe, I would probably create three indices (you'll need to study the EXPLAIN plans to determine which column order is better); a rough DDL sketch follows the list below.

  1. usrid and status
  2. status and title
  3. status and description (is this an indexable field?)
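
Roughly, the DDL could look like the following (the index names are made up, and it assumes a single products table; with per-category tables you would add the same indexes to each):

```sql
-- Hypothetical index names; check the EXPLAIN plans to confirm the column order.
ALTER TABLE products ADD INDEX idx_usr_status   (usrid, status);
ALTER TABLE products ADD INDEX idx_status_title (status, title);
-- If description is a TEXT column it needs a prefix length here, e.g.
-- description(100); for word searches a MyISAM FULLTEXT index is an alternative.
ALTER TABLE products ADD INDEX idx_status_desc  (status, description);
```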

Another note on indices: a covering index, that is one that contains every column a particular query reads, can also be a useful solution if one of your frequent access patterns is retrieval by primary key, since such a query can then be answered from the index alone.
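
A small sketch (the index name and filter values are made up): with an index on (usrid, status, title), a query that filters on usrid and status and reads only title never has to touch the data rows, which EXPLAIN reports as "Using index":

```sql
-- Hypothetical covering index for the "user's products" query.
ALTER TABLE products ADD INDEX idx_usr_status_title (usrid, status, title);

-- The Extra column of the EXPLAIN output should show "Using index",
-- meaning the query is satisfied entirely from the index.
EXPLAIN SELECT title
FROM products
WHERE usrid = 42 AND status = 'active';
```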

orangepips
  • 9,891
  • 6
  • 33
  • 57
0

Have you considered using memcached? Your application caches the result set of a database query in memcached and serves it straight from the cache when the same data is requested again, for example by multiple users; only on a cache miss does it query the database. It should alleviate the load on the database significantly.

http://memcached.org/

Matt Asbury
  • 5,644
  • 2
  • 21
  • 29
  • I've observed that MySQL inherently provides caching mechanisms for queries, but I'm looking for optimization from a query and schema design point of view. – avib Dec 23 '10 at 11:28