
I have written a short Monte Carlo integration algorithm to calculate an integral in Fortran 90. I once compared the result obtained by solving the integral with respect to some parameter using the intrinsic random number generator with the result from the ran1 generator presented in Numerical Recipes in Fortran 90, Volume 2.

Running the same algorithm twice, once calling the intrinsic random_seed() followed by repeated calls to random_number(), and once calling the ran1() routine from the Numerical Recipes book, I obtain in principle the same shape, but the intrinsic result is a continuous curve, in contrast to the ran1 result. In both cases I evaluate the function with random arguments 10,000 times for a parameter value q, sum the results, then move on to the next q value, evaluate the function another 10,000 times, and so on.
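For reference, the procedure described above boils down to something like the following sketch. The integrand f(x, q), the q grid, and the sample count are placeholders for whatever is actually being integrated (the real code integrates a scattering form factor):

```fortran
! Minimal sketch of the Monte Carlo loop described above,
! using the intrinsic generator. f(x, q) is a placeholder integrand.
program mc_sketch
    implicit none
    integer, parameter :: ncalls = 10000, nq = 100
    integer :: i, j
    real :: x, s, q

    call random_seed()                  ! seed the intrinsic generator
    do j = 1, nq
        q = 0.1 * j                     ! next parameter value
        s = 0.0
        do i = 1, ncalls
            call random_number(x)       ! uniform sample in [0,1)
            s = s + f(x, q)             ! accumulate integrand samples
        end do
        print *, q, s / ncalls          ! MC estimate at this q
    end do

contains

    real function f(x, q)               ! placeholder integrand
        real, intent(in) :: x, q
        f = exp(-q * x)
    end function f

end program mc_sketch
```

Swapping the call to random_number() for ran1() in the inner loop reproduces the comparison in the question.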

A comparative image of the result can be found here: http://i.imgur.com/gZVFdOP.png

If I increase the number of calls, both curves converge. But I was wondering: why does the intrinsic random number generator produce this smoothness? Is it still generally advisable to use it, or are there other, more recommended RNGs? I suppose the continuous result is a consequence of the "lesser" randomness of the intrinsic generator.

(I left out the source code as I don't think that there is a lot of input from it. If somebody cares I can hand it in later.)

DomiD
  • Use the tag [tag:fortran] for all Fortran questions. Add a version tag if necessary to distinguish. Probably not here, BTW, because the same RNG is in later versions: 95, 2003, 2008, 2015... – Vladimir F Героям слава Jul 08 '16 at 09:11
  • Ah okay, thank you. I haven't picked up the later versions of Fortran yet (as I read somewhere that they are not compatible with Python) and didn't know whether they changed the intrinsic random number generator in newer versions. – DomiD Jul 08 '16 at 09:17
  • I can see that there is an accepted answer, however I would like to ask few questions to challenge my own understanding. What do you mean by converge in – innoSPG Jul 08 '16 at 14:12
  • Well, what I mean is that the difference between both methods becomes smaller for an increasing number of function evaluations. In reality the integral shouldn't go to a constant value, but just become smaller and smaller with increasing q-value (the integral I am trying to solve is a scattering form factor; it should basically go down with q^(-4)). So as I increase the number of function calls, more and more of the real curve becomes visible, and both RNGs basically come to the same result. I am just wondering why the noise level at too few function calls looks so different. – DomiD Jul 08 '16 at 14:43

4 Answers


There are NO guarantees about the quality of the pseudo-random number generator in standard Fortran. If you care about the quality of the implementation, for cryptography or for science sensitive to random numbers (Monte Carlo), you should use a library over which you have control.

You can study the manual of your compiler to find out what it says about the random number generator, but every compiler can implement a completely different algorithm to generate random numbers.
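As a small illustration of how implementation-defined the intrinsic generator is, even the size of its seed array differs between compilers. It can be queried portably with the standard random_seed interface:

```fortran
! Query the compiler-specific seed of the intrinsic generator.
! The reported seed size varies between compilers and versions,
! which underlines that the algorithm itself is implementation-defined.
program seed_info
    implicit none
    integer :: n
    integer, allocatable :: seed(:)

    call random_seed(size=n)        ! how many integers the seed needs
    allocate(seed(n))
    call random_seed(get=seed)      ! fetch the current seed values
    print *, "seed size for this compiler:", n
end program seed_info
```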

Numerical Recipes is actually not well regarded by some people in the numerical mathematics community: http://www.uwyo.edu/buerkle/misc/wnotnr.html

This site is not for software recommendations, but this article (link given by roygvib in a comment), https://arxiv.org/abs/1005.4117, is a good review with examples of bad and good algorithms, methods to test them, ways to generate arbitrary number distributions, and example calls of two libraries in C (one of which can be called from Fortran as well).
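As one very rough example of the kind of empirical testing such articles discuss, a simple chi-square uniformity check of the intrinsic generator might look like the following sketch (the bin count and sample size are arbitrary choices, and a single run is of course no substitute for a proper test suite):

```fortran
! Rough sketch: chi-square uniformity test for random_number().
! With k bins and n samples, the statistic
!     chi2 = sum_i (observed_i - n/k)**2 / (n/k)
! should come out near k-1 for a decent uniform generator.
program chi2_uniform
    implicit none
    integer, parameter :: k = 100, n = 100000
    integer :: counts(k), i, bin
    real :: x, expected, chi2

    counts = 0
    call random_seed()
    do i = 1, n
        call random_number(x)            ! x in [0,1)
        bin = min(int(x * k) + 1, k)     ! map sample to a bin
        counts(bin) = counts(bin) + 1
    end do

    expected = real(n) / real(k)
    chi2 = sum((real(counts) - expected)**2) / expected
    print *, "chi-square =", chi2, " (expect about", k - 1, ")"
end program chi2_uniform
```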

Personally, I use this parallel PRNG: https://bitbucket.org/LadaF/elmm/src/master/src/rng_par_zig.f90. I haven't tested its quality myself; I mainly need speed. But, again, this is not a software recommendation site.

  • Ah, OK, I didn't see that! Thank you, this makes everything a bit clearer! If anybody else wonders: I used gfortran version 5.3.1 for the compilation, and it uses the KISS algorithm. I'll probably step back from using the Numerical Recipes code, look into the pros and cons of the different PRNG algorithms, and run some more tests. Thanks! – DomiD Jul 08 '16 at 09:40

The particular random number generator used depends on the compiler. Perhaps it is documented, perhaps not; perhaps it is subject to change. For high-quality work, I would use library or source code from elsewhere. A webpage with Fortran implementations of the Mersenne Twister: http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/VERSIONS/FORTRAN/fortran.html. I have used http://theo.phys.sci.hiroshima-u.ac.jp/~ishikawa/PRNG/mt_stream_en.html and verified it against the original implementation. This version is useful for multi-threaded programs.

M. S. B.
  • Thanks a lot for this wonderful link! Note that the authors (Matsumoto and Nishimura) are the same as the coders of the [C++11 MT standard random number engine](https://en.cppreference.com/w/cpp/numeric/random/mersenne_twister_engine), which surely guarantees some quality standard. – jmon12 Nov 02 '20 at 17:14

I agree with Vladimir F. But...

To help you in your quest for better random numbers, consider the fairly recent addition to C++: the C++11 pseudo-random number generators. The Mersenne Twister and lots of others are there. It's a pretty well-thought-out system. I see two ways you could test those sequences:

  • make a system call from Fortran to a little C++ utility that generates a bunch of them for you or
  • bind the random number generators to Fortran through ISO_C_BINDING.

I prefer the second and because the bindings are a little tricky, I have prepared an example. The files are:

  1. cxx11_rand.h and cxx11_rand.cpp which define calls to the random number generator
  2. c_rand.cpp which wraps the C++ functions in C functions
  3. f_rand_M.f90 which binds the C functions to Fortran
  4. f_main.f90 which uses the module functions to generate 10 random numbers in [0,1).

cxx11_rand.h

#ifndef CXX11_RAND_H
#define CXX11_RAND_H

void cxx11_init();
double cxx11_rand();

#endif

cxx11_rand.cpp

#include <algorithm>
#include <cstdio>
#include <memory>
#include <random>
#include <set>
#include <vector>
#include <chrono>
#include <thread>
#include <iostream>

#include "cxx11_rand.h"

static std::unique_ptr<std::uniform_real_distribution<>> dist;
static std::unique_ptr<std::mt19937_64> eng;

void cxx11_init(){
    eng = std::unique_ptr<std::mt19937_64>( new std::mt19937_64(std::random_device{}()) );
    dist = std::unique_ptr< std::uniform_real_distribution<> >( new std::uniform_real_distribution<>(0.0,1.0) );
}

double cxx11_rand(){
    return (*dist)( *eng );
}

c_rand.cpp

#include "cxx11_rand.h"

#ifdef __cplusplus
extern "C" {
#endif

void c_init(){
    cxx11_init();
}

double c_rand(){
    return cxx11_rand();
}

#ifdef __cplusplus
}
#endif

f_rand_M.f90

module f_rand_M

    implicit none

    !define fortran interface bindings to C functions
    interface

        subroutine fi_init() bind(C, name="c_init")
        end subroutine

        real(C_DOUBLE) function fi_rand() bind(C, name="c_rand")
            use ISO_C_BINDING, only: C_DOUBLE
        end function

    end interface

contains

    subroutine f_init()
        call fi_init()
    end subroutine

    real(C_DOUBLE) function f_rand()
        use ISO_C_BINDING, only: C_DOUBLE
        f_rand = fi_rand()
    end function

end module

f_main.f90

program main

    use f_rand_M

    implicit none
    integer :: i

    call f_init()
    do i=1,10
        write(*,*)f_rand()
    end do

end program

You can compile/link with the following GNU commands

echo "compiling objects"
g++ -c --std=c++11 cxx11_rand.cpp c_rand.cpp
gfortran -c f_rand_M.f90

echo "building & executing fortran main"
gfortran f_main.f90 f_rand_M.o c_rand.o cxx11_rand.o -lstdc++ -o f_main.exe
./f_main.exe

Your output should look like this (with different random numbers, of course; the seed here was chosen from a "source of entropy", e.g. the wall time).

compiling objects
building & executing fortran main
  0.47439556226575341
  0.11177335018127127
  0.10417488557661241
  0.77378163596792404
  0.20780793755332663
  0.27951447624366532
  0.66920698086955666
  0.80676663600103105
  0.98028384008440417
  0.88893587108730432

I used GCC 4.9 on a Mac for testing.

wawiesel
    Many of these algorithms have Fortran implementations available. – Vladimir F Героям слава Jun 16 '17 at 22:24
  • I'm not sure that's constructive...most algorithms have implementations in many languages available. But the C++ ones are as close to a "standard" implementation as you can find, and it's not hard to write the bindings. – wawiesel Jun 21 '17 at 01:58
  • He obviously programs in Fortran. I don't want to speak for the OP, but when I program in Fortran it is easier for me to call Fortran implementations (and I DO know how to call C++). And many algorithms are so old that I would call the Fortran version "standard". Also, Fortran is just an easier and cleaner language. – Vladimir F Героям слава Jun 21 '17 at 05:57
  • What the OP wanted: a bunch of different PRNGs to test, using his chosen language of Fortran. My solution: I know you have g++ if you have gfortran, so copy/paste these bindings into your Fortran code base to access the C++11 random number generators from Fortran. Your solution: go out and find these algorithms in Fortran, and I agree with you that they must exist. But the time it takes to find them and verify their quality... I think it's non-negligible. But hey, agree to disagree. :) – wawiesel Jun 21 '17 at 21:29
  • Well, I am not sure what he wanted; a bunch of PRNGs? I don't see it in the question. Asking for libraries is off topic on Stack Overflow anyway. In my reading he was asking about the properties of the intrinsic generator and whether there are any better ones available, not for a recommendation of a particular library. – Vladimir F Героям слава Oct 10 '17 at 14:11

A PRNG is a bad option when you are doing MC, and it does not matter in which programming language. For MC simulations it is always a good idea to use a service like Random.org or a hardware random number generator. The book Effective Java, in Item 47, clearly shows an example of a problematic PRNG. With PRNGs you are never sure that their internal implementation is OK.

Todor Balabanov
  • Massive MC simulations on supercomputers do not use hardware random generators. They use PRNGs, and serious scientific work is done with them. Like http://home.thep.lu.se/Pythia/pythia82html/RandomNumbers.html http://fermiqcd.net/fermiqcd/static/doxygen/html/classmdp__prng.html – Vladimir F Героям слава Oct 09 '17 at 08:47
  • If the scientific work is serious, PRNGs are not the tool that should be used. The fact that people do it does not mean that it is correct. Read Item 47 from the book and you will find out why. It is enough to hit one case like the one described in the book to damage your scientific research. Even in the Manhattan Project, random numbers were very carefully selected in order for the calculations to be good enough. – Todor Balabanov Oct 09 '17 at 10:59
    The Manhattan Project is the stone age. We have computers now. The fact that there are some bad generators does not mean that all are bad. Many are good and carefully selected. People DO use them for scientific computations; that is a fact. There is no way to do otherwise for large-scale supercomputing. – Vladimir F Героям слава Oct 09 '17 at 11:05
  • It is up to the person who is doing the research. If this person is OK with accepting the risk of fake results because of the PRNG, he/she can go ahead with no worries. – Todor Balabanov Oct 09 '17 at 11:08
    This article is probably useful: https://arxiv.org/abs/1005.4117 (see page 8 about ran2 vs ran1 or ran0, where the OP is using ran1). MT also seems very nice. Btw, I'm wondering what the example of a bad PRNG in "Item 47" (in the Java book) is... – roygvib Oct 09 '17 at 11:12
  • @roygvib This is very useful. Todor, see the estimate in there regarding random.org: *"In general, the service costs approximately US$ 1 per 4 million random bits. A large-scale Monte Carlo simulation with 10^12 random numbers would therefore cost US$ 250,000."* The same goes for hardware generators. Available supercomputers simply do not have them installed. There is no choice to be made; there is simply no practical option other than using PRNGs. – Vladimir F Героям слава Oct 10 '17 at 14:07
  • If your research has a systematic error because of a PRNG problem, it does not matter how much you are going to save. If the amount of random numbers is about 10^12, your calculation will be even more sensitive to the PRNG. – Todor Balabanov Oct 11 '17 at 07:05
    There is **no other way**, simply NONE. There are no hardware generators in supercomputing centers, or only on a few computers. They simply are not available. – Vladimir F Героям слава Oct 12 '17 at 22:00
  • There is no other way only for people who do not want to search for another way. As I said, do what you want with the risks you are taking. – Todor Balabanov Oct 13 '17 at 07:10