I need to be able to select 3 unique random ID rows from a MySQL table that keeps growing over time, with a WHERE status = 'available' condition, so an ID is only picked if its status is 'available'.

There will be no gaps, as all data is kept for record-keeping compliance; if data is deleted by the user, it is not actually deleted, just its STATUS is marked 'deleted'.

I've read many posts on the subject, but they all seem to boil down to picking just 1 result and then repeating that 3 times. The problem is that it might pick the same ID again, so any solution to my problem will be much appreciated.

Thank you

:)

I was looking at doing something like this (taken from here ... MySQL select 10 random rows from 600K rows fast)

SELECT id
  FROM table AS r1 JOIN
       (SELECT (RAND() *
                     (SELECT MAX(id)
                        FROM table)) AS id2)
        AS r2
 WHERE r1.id >= r2.id2 AND status = 'available'
 ORDER BY r1.id ASC
 LIMIT 1

This gives me what I need, but it only returns 1 random unique pick. Plus, if I LIMIT it to 3, it does not always pick 3, and when it does pick 3, the rows always seem to be sequential, which is not random!

Is there a way to repeat the query, store the ID that was picked, and keep looping / re-running the query until 3 different IDs have been picked at random? This is my question :)
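For what it's worth, the loop described here can be sketched in application code. This is a rough illustration using Python's `sqlite3` module as a stand-in for MySQL (SQLite has no `RAND()`, so `ABS(RANDOM()) % MAX(id)` plays the role of `RAND() * MAX(id)`); the table name, column names, and data are all invented for the demo:

```python
import sqlite3

# Toy stand-in for the MySQL table: sqlite3 in memory, invented data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO t (status) VALUES (?)",
    [("available",) if i % 3 else ("deleted",) for i in range(100)],
)

# Rerun the seek-style query from the question until 3 distinct ids
# have been collected; a set weeds out repeats automatically.
picked = set()
while len(picked) < 3:
    row = conn.execute(
        """
        SELECT id FROM t
        WHERE status = 'available'
          AND id >= (SELECT 1 + ABS(RANDOM()) % MAX(id) FROM t)
        ORDER BY id
        LIMIT 1
        """
    ).fetchone()
    if row:                 # the random start can land past the last available row
        picked.add(row[0])

print(sorted(picked))       # 3 distinct ids, all with status 'available'
```

Each round stays a cheap indexed seek, but note that, like the original query, this is only semi-random: an id sitting just after a run of 'deleted' rows gets picked more often than the others.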


3 Answers

2

Did you try:

SELECT *
FROM mytable
WHERE status = 'available'
ORDER BY RAND( )
LIMIT 0,3;
ceyquem
  • just add the conditions : `WHERE status = 'available'` – Kobi Nov 30 '13 at 14:40
  • This is the comment OP left on my answer (since deleted; I couldn't understand it): "I need a fast answer, sorry my post not explain properly, over time might have many 100's thousands of rows, and not always 'limit 3' will return 3, sometimes only return 1 or 2" – Jason Heo Nov 30 '13 at 15:27
  • This does a full table scan, so not "fast". – Rick James May 07 '17 at 03:26
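The comments are right that `ORDER BY RAND()` scans every 'available' row, but it does return 3 distinct rows in one shot, since a single SELECT never emits the same row twice. A quick sanity check of this behaviour, using Python's `sqlite3` in memory as a stand-in for MySQL (SQLite spells `RAND()` as `RANDOM()`; the data is invented):

```python
import sqlite3

# In-memory stand-in for the MySQL table; the data is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO mytable (status) VALUES (?)",
    [("available",) if i % 2 else ("deleted",) for i in range(50)],
)

# Same shape as the answer's query, with RANDOM() in place of RAND().
ids = [r[0] for r in conn.execute(
    "SELECT id FROM mytable WHERE status = 'available' "
    "ORDER BY RANDOM() LIMIT 3"
)]

print(ids)   # 3 distinct ids, each with status 'available'
```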
0
SELECT DISTINCT *
FROM table
WHERE status = 'available'
ORDER BY RAND()
LIMIT 3

The DISTINCT clause prevents repeated rows, and ORDER BY RAND() randomly shuffles the result before LIMIT 3 takes the first three.

Byte Bit
  • nice, I forgot about the distinct, however will this be FAST on a table that contains say 1 million+ rows? – user3052268 Dec 01 '13 at 09:17
  • Well, you won't select a million rows to display on a page; you'd use pagination, right? And if you are building a query for statistics or rankings, you should process the data and store it somewhere else. That could take time, but it would only be done once per day or hour (just as an example). Anyway, once you reach millions of rows, most operations will take some time. – Byte Bit Dec 01 '13 at 18:50
  • This does a full table scan, so not "fast". – Rick James May 07 '17 at 03:26
0

See the following link; where it talks about "gaps", you have the equivalent (status = 'deleted'): http://mysql.rjweb.org/doc.php/random#case_auto_increment_with_gaps_1_or_more_rows_returned (and look around that page for variants)

  • Requirement: AUTO_INCREMENT, possibly with gaps due to DELETEs, etc
  • Flaw: Only semi-random (rows do not have an equal chance of being picked), but it does partially compensate for the gaps
  • Flaw: The first and last few rows of the table are less likely to be delivered.

This gets 50 "consecutive" ids (possibly with gaps), then delivers a random 10 of them.

-- First select is one-time:
SELECT @min := MIN(id),
       @max := MAX(id)
    FROM RandTest;
SELECT a.*
    FROM RandTest a
    JOIN ( SELECT id FROM
            ( SELECT id
                FROM ( SELECT @min + (@max - @min + 1 - 50) *
                         RAND() AS start FROM DUAL ) AS init
                JOIN RandTest y
                WHERE    y.id > init.start
                ORDER BY y.id
                LIMIT 50           -- Inflated to deal with gaps
            ) z ORDER BY RAND()
           LIMIT 10                -- number of rows desired
         ) r ON a.id = r.id;

Yes, it is complex, but yes, it is fast, regardless of the table size.
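The same two-step idea (random starting id, over-fetch a window of consecutive ids, then sample from the window) can also be mimicked in application code. A rough Python/sqlite3 sketch, keeping the window size of 50 and sample size of 10 from the answer's query; the table contents are invented, with every 7th id missing to simulate gaps:

```python
import random
import sqlite3

# Invented gappy table: every 7th id is absent, like 'deleted' rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE RandTest (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany(
    "INSERT INTO RandTest (id, val) VALUES (?, ?)",
    [(i, f"row{i}") for i in range(1, 1001) if i % 7],
)

lo, hi = conn.execute("SELECT MIN(id), MAX(id) FROM RandTest").fetchone()

# Step 1: random starting point, deflated by the window size; this is
# the @min + (@max - @min + 1 - 50) * RAND() expression from the answer.
start = lo + random.random() * (hi - lo + 1 - 50)

# Step 2: grab up to 50 consecutive ids past the start (a cheap indexed
# range scan, gaps simply skipped), then keep a random 10 of them.
window = [r[0] for r in conn.execute(
    "SELECT id FROM RandTest WHERE id > ? ORDER BY id LIMIT 50", (start,)
)]
sample = random.sample(window, 10)

print(sorted(sample))   # 10 distinct ids drawn from one random window
```

As the flaws listed above say, the 10 rows come from one 50-id window, so they are correlated, and the last few ids of the table are slightly under-represented. For this question's case, LIMIT 10 would become LIMIT 3, and the inner query would also filter on status = 'available'.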

Rick James