I am running analyses on very large graphs with random weights. I know that rand() is a poor choice, so based on what I have read about proper random number generation, I have been using this function:
double randomZeroToOne()
{
    random_device rd;
    mt19937 mt(rd());
    uniform_real_distribution<double> dist(0.0, 1.0);
    double randomRealBetweenZeroAndOne = dist(mt);
    return randomRealBetweenZeroAndOne;
}
Every time I want to put a weight in my graph, I call this function and insert the result into my adjacency matrix. However, I am worried that this may be a slow way to generate the numbers. It has worked fine for small graphs, but my larger graphs are very slow (something else is likely the bottleneck, but I wanted to double-check and learn the proper approach). What is the best way to generate these numbers as quickly as possible while preserving their quality?
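For reference, a variant I have seen suggested is to construct the engine once and reuse it across calls, since seeding a fresh mt19937 from random_device on every invocation is expensive. A minimal sketch (the static keyword and the single-threaded assumption are mine):

```cpp
#include <random>

// Sketch: keep one engine alive across calls instead of seeding a new
// mt19937 from random_device on every invocation (seeding is the slow part).
// Assumes single-threaded use; thread_local would be needed with threads.
double randomZeroToOneFast()
{
    static std::mt19937 mt{std::random_device{}()};
    static std::uniform_real_distribution<double> dist(0.0, 1.0);
    return dist(mt);
}
```

Each call then only draws from the already-seeded engine, which is cheap.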
Additionally, if you know a way to initialize a vector of a known size with fast, good, uniformly distributed random numbers, that would be even better (although I am still curious about the answer to my main question).
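For the vector case, something like std::generate seems plausible. Here is a sketch of what I have in mind (the helper name randomVector and the shared static engine are my assumptions):

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Sketch: pre-size the vector, then fill it with independent draws.
// Note: vector<double>(n, dist(mt)) would NOT work -- it evaluates
// dist(mt) once and copies that single value into every slot.
std::vector<double> randomVector(std::size_t n)
{
    static std::mt19937 mt{std::random_device{}()};
    std::uniform_real_distribution<double> dist(0.0, 1.0);
    std::vector<double> v(n);
    std::generate(v.begin(), v.end(), [&] { return dist(mt); });
    return v;
}
```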
EDIT:
This is my new proposed solution:
#include <iostream>
#include <random>
#include <vector>
#include <algorithm>
using namespace std;

random_device rd;
mt19937 mt(rd());
uniform_real_distribution<double> dist(0.0, 1.0);

int main()
{
    double randomRealBetweenZeroAndOne = dist(mt);
    double anotherRandomRealBetweenZeroAndOne = dist(mt);
    double anothernother = dist(mt);

    // vector<double> randoman(10, dist(mt)) would copy ONE draw into all
    // ten slots; generate() makes a fresh draw per element instead.
    vector<double> randoman(10);
    generate(randoman.begin(), randoman.end(), [] { return dist(mt); });

    cout << randomRealBetweenZeroAndOne << endl;
    cout << anotherRandomRealBetweenZeroAndOne << endl;
    cout << anothernother << endl;
}
Please let me know if you see any issues with this, especially since these draws will happen many times.