
I have written this sample code that generates random numbers using multiple threads.

I can print each random number without any problems, but what I am trying to achieve is to collect all the returned integers into the generated_numbers list.

from random import randint
from threading import Thread
import time


def generate_random_number(n):
    time.sleep(randint(0, 99)*0.01)
    random_number = randint(0, 99)
    print(str(n) + 'th random number: ' + str(random_number))
    return random_number


generated_numbers = []

for i in range(1, 100):
    thread = Thread(target=generate_random_number, args=(i,))
    thread.start()

2 Answers


I would suggest using multiprocessing instead of threading; with the Pool class you can map the function generate_random_number over the iterable range(1, 100).

Here is an example that does what you want:

N.B. a pool of 100 worker processes is a little overkill for such a small iterable.

import multiprocessing
from random import randint


def generate_random_number(n):
    random_number = randint(0, 99)
    return f'{n}th random number: {random_number}'


if __name__ == '__main__':
    pool = multiprocessing.Pool(100)
    generated_numbers = pool.map(generate_random_number, range(100))
    print(generated_numbers)

Do something like

from random import randint
from threading import Thread
import time


def generate_random_number(n):
    global generated_numbers
    time.sleep(randint(0, 99)*0.01)
    random_number = randint(0, 99)
    print(str(n) + 'th random number: ' + str(random_number))
    generated_numbers.append(random_number)
    return random_number


generated_numbers = []

threads = [Thread(target=generate_random_number, args=(i,)) for i in range(100)]
for thread in threads:
    thread.start()

for t in threads:
    t.join()

print(generated_numbers)

t.join() blocks until that thread has finished, so joining every thread in turn guarantees that all of them have completed before generated_numbers is printed.

  • I am not so sure about this solution; it works, but is modifying a global list from multiple threads safe? – nipunasudha Dec 12 '17 at 14:21
  • Not sure if it's safe to be honest; an alternative is using a queue with multithreading @nipunasudha – Dusan Gligoric Dec 12 '17 at 14:31
  • I guess it depends. In CPython it would be safe because of the global interpreter lock (GIL) which runs only one thread at a time. If you are using multiple processes which can run code in parallel then it won't be of course. – etaloof Dec 12 '17 at 15:33
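As the comments suggest, a way to sidestep the shared-list question entirely is to have each worker put its result on a queue.Queue, which is explicitly thread-safe, and drain it after joining. A minimal sketch of that approach:

```python
from queue import Queue
from random import randint
from threading import Thread

results = Queue()  # thread-safe FIFO; put() and get() use internal locking


def generate_random_number(n, out):
    # put the (index, value) pair on the queue instead of mutating a shared list
    out.put((n, randint(0, 99)))


threads = [Thread(target=generate_random_number, args=(i, results))
           for i in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# all puts are done once every thread is joined, so draining is safe here
generated_numbers = [value for _, value in
                     sorted(results.get() for _ in range(results.qsize()))]
print(len(generated_numbers))  # 100
```

Sorting by the index recovers the original task order, which the append-to-global version does not guarantee.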