I was looking into the int rand() function from <stdlib.h> in C11 when I stumbled upon the following cppreference example for rolling a six-sided die.
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    srand(time(NULL)); // use current time as seed for random generator
    int random_variable = rand();
    printf("Random value on [0,%d]: %d\n", RAND_MAX, random_variable);

    // roll a 6-sided die 20 times
    for (int n = 0; n != 20; ++n) {
        int x = 7;
        while (x > 6)
            x = 1 + rand() / ((RAND_MAX + 1u) / 6); // Note: 1+rand()%6 is biased
        printf("%d ", x);
    }
}
Specifically this part:
[...]
while (x > 6)
    x = 1 + rand() / ((RAND_MAX + 1u) / 6); // Note: 1+rand()%6 is biased
[...]
Questions:

1. Why the addition of + 1u? Since rand() is [0, RAND_MAX], I'm guessing that doing rand()/(RAND_MAX/6) -> [0, RAND_MAX/(RAND_MAX/6)] -> [0,6]? And since it's integer division, (LARGE/(LARGE+small)) < 1 -> 0, so adding 1u gives it the required range of [0,5]?

2. Building on the previous question, assuming [0,5], 1 + (rand()/((RAND_MAX+1u)/6)) should only go through [1,6] and never trigger a second loop?
I've been poking around to see if rand() has returned float at some point, but that seems like a pretty huge breakage for old code? I guess the check makes sense if you add 1.0f instead of 1u, making it a floating-point division?
Trying to wrap my head around this; I have a feeling that I might be missing something.

(P.S. This is not the basis for anything security critical, I'm just exploring the standard library. D.S.)