I have this random number function that I wrote in C:

/************************************************************************
 Function:    randomNumber
 Description: Generates a random float between 0.0 and a given max
*************************************************************************/
float randomNumber(float max) {
    /* Scale rand()'s integer output from [0, RAND_MAX] into [0.0, max] */
    float x = ((float)rand() / (float)RAND_MAX) * max;
    return x;
}

If I call it using:

randomNumber(1.0);

I get the number 0.00125125889 every single time.

I have used it many times in the past with other programs/assignments, but it is not working in the program I am currently writing, and I am not sure why. I tried calling srand(time(NULL)) before calling rand() (I never had to do this in my other programs), and it still doesn't work properly. Maybe it is the #defines I am using, but I don't see how they would relate to the rand functions.
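For reference, here is a minimal, self-contained version of how I am calling it. The main function below is only an illustration of my setup (my real program is larger), and the cast on time(NULL) is just to match srand's unsigned int parameter:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

float randomNumber(float max);   /* defined as above */

int main(void) {
    srand((unsigned)time(NULL)); /* seed once, before the first rand() call */
    printf("%f\n", randomNumber(1.0));
    return 0;
}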

I would be very grateful if someone could help me figure out why it isn't working.

EDIT:

It seems to work when the call is inside a for loop; why doesn't it work outside one?
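Roughly, this is what I mean. Both versions below are sketches of the body of main, assuming the same includes, seeding, and randomNumber definition as above (the loop bound of 5 is arbitrary):

/* Version 1: inside a for loop, each iteration prints a different value */
for (int i = 0; i < 5; i++) {
    printf("%f\n", randomNumber(1.0));
}

/* Version 2: a single call outside any loop prints 0.00125125889 on every run */
printf("%f\n", randomNumber(1.0));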
