A visitor opens a URL, for example

/transport/cars/audi/a6

or

/real-estate/flats/some-city/city-district

I plan a separate table for cars and for real estate (one table per top-level category).

Based on the URL (split into an array with PHP's explode()):

$array[0] - from this value I know which table to SELECT from

and so on for $array[1], $array[2] ..., which give the remaining filter values.

For example, the RealEstate table might look like this:

IdOfAd | RealEstateType | Location1 | Location2     | TextOfAd | and so on
---------------------------------------------------------------------------
1      | flat           | City1     | CityDistrict1 | text..   |
2      | land           | City2     | CityDistrict2 | text..   |
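
In SQL that table could be created roughly like this (a sketch only; the column types and lengths are assumptions, and the real column list is abbreviated):

CREATE TABLE RealEstate (
    IdOfAd INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    RealEstateType VARCHAR(50) NOT NULL,   -- 'flat', 'land', ...
    Location1 VARCHAR(100) NOT NULL,       -- city
    Location2 VARCHAR(100) NOT NULL,       -- city district
    TextOfAd TEXT NOT NULL
    -- and so on
);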

And the MySQL query to display the ads would be something like:

SELECT `TextOfAd`, `and so on...`
FROM RealEstate
WHERE RealEstateType = ? AND Location1 = ? AND Location2 = ?
-- and possibly additional AND .. AND
LIMIT $start, $limit
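
For instance, for the URL /real-estate/flats/some-city/city-district the placeholders would be filled from the URL segments, roughly like this (the mapping of the 'flats' segment to the stored value 'flat', and the page size of 20, are assumptions for illustration):

SELECT `TextOfAd`
FROM RealEstate
WHERE RealEstateType = 'flat'      -- mapped from the 'flats' URL segment
  AND Location1 = 'some-city'
  AND Location2 = 'city-district'
LIMIT 0, 20;                       -- first page, 20 ads per page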

Thinking about performance: hopefully, after some time, the number of active ads will be high. (Also, I plan not to delete expired ads, just to change a column value to 0 so they are not shown in the listing SELECTs, but are still displayed when visited directly from a search engine.)
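
A minimal sketch of that idea, assuming a hypothetical Active column (TINYINT, 1 = active, 0 = expired):

-- mark an ad as expired instead of deleting it
UPDATE RealEstate SET Active = 0 WHERE IdOfAd = 1;

-- listing pages add Active = 1 to the WHERE clause
SELECT `TextOfAd`
FROM RealEstate
WHERE RealEstateType = ? AND Location1 = ? AND Location2 = ? AND Active = 1
LIMIT $start, $limit

-- a direct visit from a search engine looks the ad up by id, regardless of Active
SELECT `TextOfAd` FROM RealEstate WHERE IdOfAd = ?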

What do I need to do (change the database design, or SELECT in some other way) if, for example, the number of rows in a table reaches 100,000 or millions?

I am also thinking about moving expired ads to another table (for which performance is not important). For example, a user comes from a search engine to a URL with an expired ad: first SELECT from the main table, and if nothing is found, SELECT from the table of expired ads (see the sketch below). Is this a reasonable solution?
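
A sketch of that two-table idea, assuming a hypothetical RealEstateExpired table with the same columns as RealEstate:

-- when an ad expires, move it out of the main table
INSERT INTO RealEstateExpired SELECT * FROM RealEstate WHERE IdOfAd = ?;
DELETE FROM RealEstate WHERE IdOfAd = ?;

-- for a direct visit, try the main table first;
-- only if that returns no row, query the expired table
SELECT `TextOfAd` FROM RealEstate WHERE IdOfAd = ?;
SELECT `TextOfAd` FROM RealEstateExpired WHERE IdOfAd = ?;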

Andris
  • Changing the DB type from free MySQL to another DB can help with big tables. MySQL works well even with a few hundred thousand records. – Marco Mura Dec 10 '14 at 16:59
  • You mean hundreds of thousands in each table (not the whole database)? Not sure I would change from MySQL... no knowledge of others, and I expect it would take a long time before I get a high number of ads. But at the very beginning I want to choose an optimal solution for the database design and coding – Andris Dec 10 '14 at 17:02
  • 1
    I've worked for a project with a db on mysql. One table(only one) got more than 300.000 rows. The db was working fine but you can plan when reaching 250K of records to ask people for a more stable db. – Marco Mura Dec 10 '14 at 17:05
  • Would a `SELECT` query written the way I described be OK? Nothing to improve? – Andris Dec 10 '14 at 17:08
  • 1
    Be sure that the field you use will be indexed to not go into full table search, it can improve a little i think. – Marco Mura Dec 10 '14 at 17:09

1 Answer

Two hints:

  1. Use ENGINE=InnoDB when creating your table. InnoDB uses row-level locking, which is MUCH better for bigger tables, as this allows rows to be read much faster, even when you're updating some of them.

  2. ADD INDEX on suitable columns. Indexing big tables can reduce search times by several orders of magnitude. They're easy to forget and a pain to debug! More than once I've been investigating a slow query, realised I forgot a suitable INDEX, added it, and had immediate results on a query that used to take 15 seconds to run.
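
A minimal sketch of both hints applied to the RealEstate table from the question (the index name and column choice are assumptions; index the columns your WHERE clause actually filters on):

-- 1. make sure the table uses the InnoDB engine
ALTER TABLE RealEstate ENGINE = InnoDB;

-- 2. index the columns the listing query filters on
ALTER TABLE RealEstate
    ADD INDEX idx_type_location (RealEstateType, Location1, Location2);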

Niet the Dark Absol
  • Simply `ALTER TABLE categories ADD INDEX (CategoriesUrl)`? Is that enough? Like the information here: http://stackoverflow.com/a/708508/2118559? – Andris Dec 10 '14 at 17:16
  • Are you searching by `CategoriesUrl` column? – Niet the Dark Absol Dec 10 '14 at 17:19
  • Searching would be the same `SELECT ... WHERE`, based on the user's input. I plan to search in the text of the ad and by category names. But if the user navigates category/subcategory, then the query would also be a `SELECT` – Andris Dec 10 '14 at 17:24