I'm trying to generate a random-number sequence with rand(). I have something like this:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>


int Random(int min, int max)
{
  /* returns a random integer in [min, max] */

  double uniform; // random variable from uniform distribution of [0, 1]
  int ret; // return value
  srand((unsigned int)clock());

  uniform = rand() / (double)RAND_MAX;
  ret = (int)(uniform * (double)(max - min)) + min;

  return ret;
}


int main(void)
{
  for(int i=0; i<10; i++)
    printf("%d ", Random(0, 100));
  printf("\n");

  return 0;
}

It produced different results when executed on macOS v10.14 (Mojave) and on Ubuntu 18.04 (Bionic Beaver).

It works on Ubuntu:

76 42 13 49 85 7 43 28 15 1

But not on macOS:

1 1 1 1 1 1 1 1 1 1

Why doesn't it work on macOS? Is there something different about the random number generators?

– aest

  • You must call srand() only once: https://stackoverflow.com/questions/46877089/strange-behaviour-of-rand-in-xcode – Hans Passant Nov 04 '20 at 12:31
  • Also, `clock()` is often a count from the start of program execution. It is more usual to pass `time()` to `srand()`. The C18 standard (7.27.2.1) says `clock()` returns "the implementation's best approximation to the processor time used by the program since the beginning of an implementation-defined era related only to the program invocation." – Weather Vane Nov 04 '20 at 12:43
  • I think that the `clock()` function behaves strangely on macOS; can you display its value each time you use it? – Mathieu Nov 04 '20 at 12:44
  • See [srand() — why call it only once?](https://stackoverflow.com/q/7343833/995714). And `rand()` on the Mac is bad enough that you should use `arc4random()` instead; see [Why does rand() % 7 always return 0?](https://stackoverflow.com/q/7866754/995714) and [Rand() % 14 only generates the values 6 or 13](https://stackoverflow.com/q/20263187/995714) – phuclv Nov 04 '20 at 13:02
  • In addition to the other comments, it's likely that your Mac is running much faster than your Ubuntu machine. Perhaps you are running Ubuntu in a VM on your Mac? – President James K. Polk Nov 04 '20 at 14:22
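A minimal check along the lines Mathieu suggests: print what `clock()` actually returns on each call. Early in a fast program, successive calls often return the same small value, so `srand()` reseeds with the same number every time and `rand()` restarts the same sequence, which would explain the repeated output on the Mac:

#include <stdio.h>
#include <time.h>

int main(void)
{
  /* Print what clock() returns on each call; on a fast machine,
     early calls typically all report the same small tick count. */
  for (int i = 0; i < 10; i++)
    printf("clock() = %ld\n", (long)clock());

  return 0;
}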

3 Answers


I'm a Mac user. To generate random numbers I initialise the seed like this:

srand(time(NULL));

Also, initialise it in your main() so that the generator is seeded only once.
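
Applied to the code in the question, that means moving the `srand()` call out of `Random()` and into `main()` — a minimal sketch:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int Random(int min, int max)
{
  /* returns a random integer in [min, max] */
  double uniform = rand() / (double)RAND_MAX;
  return (int)(uniform * (double)(max - min)) + min;
}

int main(void)
{
  srand((unsigned int)time(NULL)); /* seed exactly once */

  for (int i = 0; i < 10; i++)
    printf("%d ", Random(0, 100));
  printf("\n");

  return 0;
}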


If reproducible "random" numbers are something you care about, you should avoid the rand function. The C standard doesn't specify exactly what the sequence produced by rand is, even if the seed is given via srand. Notably:

  • rand uses an unspecified random number algorithm, and that algorithm can differ between C implementations, including versions of the same standard library.
  • rand returns values no greater than RAND_MAX, and RAND_MAX can differ between C implementations.

Instead, you should use an implementation of a pseudorandom number generator with a known algorithm, and rely on your own code to transform its outputs into the numbers you desire. (For many ways to do so, see my page on sampling algorithms. Note that there are other things to consider when reproducibility is important.)
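
For instance, here is a small sketch using George Marsaglia's 32-bit xorshift (chosen purely for illustration; any fully specified algorithm works) together with an explicit, hand-written range mapping, which produces the same sequence on every platform:

#include <stdio.h>
#include <stdint.h>

static uint32_t rng_state = 2463534242u; /* seed; must be nonzero */

/* Marsaglia's xorshift32: fully specified, so it behaves
   identically under every C implementation. */
static uint32_t xorshift32(void)
{
  rng_state ^= rng_state << 13;
  rng_state ^= rng_state >> 17;
  rng_state ^= rng_state << 5;
  return rng_state;
}

/* Our own transformation into [min, max]. Plain modulo has a
   slight bias, which is fine for a demonstration. */
static int random_range(int min, int max)
{
  return min + (int)(xorshift32() % (uint32_t)(max - min + 1));
}

int main(void)
{
  for (int i = 0; i < 10; i++)
    printf("%d ", random_range(0, 100));
  printf("\n");

  return 0;
}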


– Peter O.

rand is obsolete on macOS. Use random() instead.
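
A minimal sketch of that suggestion, using the POSIX `random()`/`srandom()` pair, seeded once as in the other answers (`arc4random()` from the comments is another macOS option; on glibc, strict `-std` flags may require `_DEFAULT_SOURCE` to expose these functions):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
  srandom((unsigned int)time(NULL)); /* seed once */

  for (int i = 0; i < 10; i++)
    printf("%ld ", random() % 101);  /* values in [0, 100] */
  printf("\n");

  return 0;
}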

– 0___________
  • Can you explain why (by editing your answer)? Why would it be different between Linux and Mac? Isn't it the same (e.g., GCC) under the covers (not a rhetorical question)? Even if it is obsolete, why would `rand` give a non-random sequence of numbers? – Peter Mortensen May 16 '21 at 12:33