
Let us say I have a table called animals that looks like this:

1 red     fox
2 blue    elephant
3 red     dog
4 yellow  cat
5 green   mouse
6 blue    rat

In effect, what I would like to do is make a temporary table (a virtual table) that only contains rows with the value I specify. For example, red:

1 red     fox
3 red     dog

and then grab a random row from this virtual table.

What I don't want to do is permanently create these tables and have to constantly update them by grabbing information from the main animals table. I would like to do this on the fly, and quickly, using some sort of subquery, view, temporary table, or whatever you would like to call it.

Please keep in mind that I am a complete PHP noob and could use as much clarification as possible regarding syntax.


1 Answer


You can use an SQL query like the following:

SELECT * FROM animals
WHERE color = 'red'
ORDER BY RAND()
LIMIT 1;

You can change the LIMIT to however many rows you want to fetch.
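
Since the question mentions PHP, here is a minimal sketch of how that query might be run from PHP with PDO. The connection details and the column names (id, color, name) are assumptions for illustration; they are not given in the question.

<?php
// Hypothetical connection details; adjust host, database name, user and password to your setup.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepared statement so the colour value can be passed in safely.
// Column names (id, color, name) are assumed; use your real schema.
$stmt = $pdo->prepare('SELECT * FROM animals WHERE color = ? ORDER BY RAND() LIMIT 1');
$stmt->execute(array('red'));

$row = $stmt->fetch(PDO::FETCH_ASSOC);
if ($row !== false) {
    echo $row['name']; // e.g. "fox" or "dog"
}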

  • This solution works, but it is a real performance killer in large applications. It might be better to select all IDs containing `red`, then select a random ID from them, and then select one row by this ID (see the sketch after these comments). – Peter Krejci Jan 08 '12 at 11:40
  • Well, I am not a PHP guru, but maybe you can get two random numbers between 1...n, where n is the total number of rows your query will fetch, and select two results from those positions. – Sap Jan 08 '12 at 11:44
  • @PeterKrejci: But that's exactly what the query does. Could you elaborate on what implementation you had in mind? – just somebody Jan 08 '12 at 11:44
  • For the OP's benefit: the SQL above will achieve the desired results, but, as Peter indicated, it will be somewhat slow. See [this article](http://stackoverflow.com/questions/1244555/how-can-i-optimize-mysqls-order-by-rand-function) as to why. The point here is scaling when your list of 'red' contains hundreds of thousands, millions, or billions of records. Initially what Grrrrr defined will work, but if you're dealing with LARGE results, then Peter's response has merit. – xQbert Jan 08 '12 at 12:04
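
For completeness, here is a minimal sketch of the two-step approach Peter Krejci describes in the first comment: fetch only the matching IDs, pick one at random in PHP, then fetch that single row. Again, the connection details and column names are assumptions, not something given in the question.

<?php
// Hypothetical connection details; same assumed schema (id, color, name) as above.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Step 1: fetch only the ids of matching rows (avoids sorting the whole result set by RAND()).
$stmt = $pdo->prepare('SELECT id FROM animals WHERE color = ?');
$stmt->execute(array('red'));
$ids = $stmt->fetchAll(PDO::FETCH_COLUMN);

if (!empty($ids)) {
    // Step 2: pick one id at random in PHP.
    $randomId = $ids[array_rand($ids)];

    // Step 3: fetch that single row by its primary key.
    $stmt = $pdo->prepare('SELECT * FROM animals WHERE id = ?');
    $stmt->execute(array($randomId));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
}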