I have a program that runs 1,000,000 iterations to simulate server load. The arrival rate of requests is a variable. For example, if the arrival rate is 2, a request should arrive on average once every 2 iterations, which would generate "around" 500,000 requests by the end of the simulation, and so on. I can't do this just by introducing a new request at every nth iteration based on the arrival rate; there must be an element of chance involved.
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

// random number generator method
int random_number(int min_num, int max_num) {
    int result = 0, low_num = 0, hi_num = 0;
    if (min_num < max_num) {
        low_num = min_num;
        hi_num = max_num + 1; // this is done to include max_num in output
    } else {
        low_num = max_num + 1; // this is done to include max_num in output
        hi_num = min_num;
    }
    result = (rand() % (hi_num - low_num)) + low_num;
    return result;
}
int main() {
    srand(time(NULL));
    unsigned int arrivalRate = 2;
    unsigned int noOfRequests = 0;
    unsigned int timer;
    for (timer = 0; timer < 1000000; timer++) {
        // gives a random number between 0 and arrivalRate, inclusive
        int x = random_number(0, arrivalRate);
        // there is a new request
        if (x <= 1) {
            noOfRequests++;
        }
    }
    printf("No of requests: %u\n", noOfRequests);
    return 0;
}
So, if I run this code with arrivalRate 2, it generates around 600,000 requests, when it should be only around 500,000 (±1,000 is tolerable). How can I improve my code to produce more reasonable results? It is generating far more requests than expected.