
I need to repeat a scientific simulation based on random sampling N times, which is easy to do serially:

results = [mysimulation() for i in range(N)]

Since every simulation requires minutes, I'd like to parallelize them in order to reduce the execution time. Some weeks ago I successfully analyzed some simpler cases, for which I wrote my code in C using OpenMP and functions like rand_r() to avoid seed overlap. How could I obtain a similar effect in Python?

I tried reading more about Python 3 multithreading/parallelization, but I found nothing about random number generation in that context. The numpy.random documentation does not seem to suggest anything in this direction either (as far as I could find).
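For concreteness, a naive translation of my OpenMP approach would look roughly like the sketch below (a process pool with one seed per task derived from the task index, assuming the simulation can be rewritten to draw from an explicit generator). The seeding part is exactly what I'm unsure about: is this a sound way to avoid seed overlap, or is there a proper mechanism for it?

```python
from multiprocessing import Pool
import numpy as np

def mysimulation(rng):
    # stand-in for the real simulation; it draws everything from rng
    return rng.standard_normal()

def run_one(i):
    # one seed per task, derived from the task index,
    # the way I used rand_r(&seed) per thread in C -- is this safe?
    rng = np.random.default_rng(1000 + i)
    return mysimulation(rng)

if __name__ == "__main__":
    N = 16
    with Pool() as pool:
        results = pool.map(run_one, range(N))
```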

  • You could look into hashing; that's what we do in real-time computer graphics when we need a lot of random sampling. Hash a specific value (like thread id + time + i) and you will get random samples (a minimal sketch of this idea follows these comments). – Cewein Feb 14 '20 at 11:27
  • Does this answer your question? [Best practices for seeding random and numpy.random in the same program](https://stackoverflow.com/questions/58954557/best-practices-for-seeding-random-and-numpy-random-in-the-same-program) – Peter O. Feb 14 '20 at 12:24
  • See also: [Is this proper use of numpy seeding for parallel code?](https://stackoverflow.com/questions/56009927/is-this-proper-use-of-numpy-seeding-for-parallel-code/56010589#56010589) – Peter O. Feb 14 '20 at 12:29
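A minimal sketch of the hashing idea from the first comment, assuming a splitmix64-style integer mixer and a made-up key layout (worker id packed together with the sample index; neither is prescribed by the comment):

```python
MASK64 = (1 << 64) - 1

def hash_to_unit(worker_id, i):
    """Hash a (worker_id, i) key to a uniform-looking float in [0, 1)."""
    x = ((worker_id << 32) ^ i) & MASK64            # pack the key into 64 bits
    # splitmix64 finalizer: add the golden-ratio increment, then mix
    x = (x + 0x9E3779B97F4A7C15) & MASK64
    x = ((x ^ (x >> 30)) * 0xBF58476D1CE4E5B9) & MASK64
    x = ((x ^ (x >> 27)) * 0x94D049BB133111EB) & MASK64
    x ^= x >> 31
    return x / 2**64                                # map to [0, 1)

# each (worker, sample) pair gets its own reproducible value,
# independent of how the work is split across workers
samples = [hash_to_unit(0, i) for i in range(10)]
```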

0 Answers