I tried this simple code, which "randomly" generates 1s or -1s a million times and sums them up:
#include <iostream>
#include <stdlib.h>
#include <time.h>

// Return +1 or -1 with (supposedly) equal probability
int pos_or_neg()
{
    if (rand() % 2 == 1) return 1;
    else return -1;
}

int main()
{
    int i, s = 0;
    srand(time(NULL));          // seed the generator once
    for (i = 0; i < 1000000; i++)
        s += pos_or_neg();      // accumulate the random walk
    std::cout << s;
}
If the rand() calls were independent, the sum should be approximately normally distributed with standard deviation 1000, but I'm getting results with magnitude in the hundreds at most. If I generate more numbers, say 100 million, the effect is even more striking: the magnitude actually decreases!
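For reference, this is the calculation I'm relying on, treating the sum as a random walk of $N$ independent steps $X_i = \pm 1$:

$$\operatorname{Var}(X_i) = E[X_i^2] - (E[X_i])^2 = 1 - 0 = 1, \qquad \operatorname{Var}\Big(\sum_{i=1}^{N} X_i\Big) = N, \qquad \sigma = \sqrt{N} = \sqrt{10^6} = 1000.$$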
Is rand() really that bad, or am I doing something wrong? If it's the former, what are some good alternatives?
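For comparison, this is the kind of replacement I'm considering, a minimal sketch of the same experiment using the C++11 <random> facilities (std::mt19937 seeded from std::random_device); I'm assuming this is the usual recommendation, but I'd welcome other suggestions:

#include <iostream>
#include <random>

int main()
{
    std::random_device rd;                          // nondeterministic seed source
    std::mt19937 gen(rd());                         // Mersenne Twister engine
    std::uniform_int_distribution<int> coin(0, 1);  // fair coin: 0 or 1

    long long s = 0;
    for (int i = 0; i < 1000000; i++)
        s += coin(gen) ? 1 : -1;                    // map {0,1} to {-1,+1}
    std::cout << s;
}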