So I have a homework assignment where we need to generate random numbers between 1 and 100 in C. I have a working example using int i = rand() % 100;.
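For context, here's roughly what my working version looks like (trimmed down; the seeding is my own addition):

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    srand((unsigned)time(NULL));   /* seed once per run */

    int i = rand() % 100;          /* what I currently do: gives 0 to 99 */
    printf("%d\n", i);
    return 0;
}
```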
But according to the homework, that is technically incorrect, which I don't really get. The homework explanation is as follows:
"1.1 We use a random number generator to simulate bus arrival times. ===> the rand( ) function.The rand( ) function returns a pseudo random number 0 to RAND_MAX (2^31-1 in linux).To generate a random number, rn, between 0.0 and 1.0; rn = rand( ) / RAND_MAX.(by the way, a lot of people do below to create, say, 2 digit random numbers. r_num = rand( ) % 100; since % 100 is 0 to 99. However, this is wrong. The right way of generate 2 digit random number is: divide 0-RAND_MAX in 10 intervals and see where the random number falls. The interval time is, it = RAND_MAX / 100. Then, map it to one of 0 - 99 by the following: 0 1 2 3 ......... 99 0 it 2it 3it 99it to RAND_MAX If the rand( ) returns a number is between (12it) and (13*it), the 2 digit random number is 12.)"
I was hoping someone could take a stab at explaining what it is saying; I'm not really looking for code examples, just an understanding of the problem.