4

let's say i have something similar to:

import threading
import time

def worker(name):
    time.sleep(10)
    print(name)

thrs = []
for i in range(1000):
    t1 = threading.Thread(target=worker, args=(i,))
    thrs.append(t1)

for t in thrs:
    t.start()

Is there a way to specify how many threads can run in parallel? In the above case, all 1000 will run at once.

ealeon
  • Is it `threads.append(t1)`? I think you mean `thrs.append(t1)`, right? – Rahul K P Jul 25 '16 at 05:48
  • I don't understand the question ... you have specified how many threads you want: 1000, if you want another number, can't you change this? – maxymoo Jul 25 '16 at 05:49
  • @maxymoo They're not sure how many are possible – cwahls Jul 25 '16 at 05:49
  • Need manage CPU and RAM processes. And `for t in thrs: t.start()` not parallel function. – dsgdfg Jul 25 '16 at 05:50
  • i have 1000 things that need to be run, but i want them to run at like 10 at a time or so. i suppose i can just run then join, run then join until no more are left, but i was trying to see if there is something that already does that – ealeon Jul 25 '16 at 05:53
  • @dsgdfg isn't it asynchronous, so if the worker func has io-bound logic, it would be parallelized, right? – ealeon Jul 25 '16 at 05:54
  • Use a logic can't resolve synchronism , maybe define a `CPU clock based trigger` for solution. Python timing very fuzzy on logical conditions. @ealeon – dsgdfg Jul 25 '16 at 06:06

1 Answer

5

This can be done using `multiprocessing.dummy`, which provides a thread-based version of the `multiprocessing` API.

from multiprocessing.dummy import Pool

# A pool of 10 threads: at most 10 workers run concurrently,
# and map blocks until all 1000 calls have finished.
pool = Pool(10)
result = pool.map(worker, range(1000))

In Python 3, `concurrent.futures.ThreadPoolExecutor` usually provides a nicer interface.
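A minimal sketch of the `ThreadPoolExecutor` approach, using the same worker shape as the question (with a shortened sleep and a smaller range so it finishes quickly):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def worker(name):
    time.sleep(0.01)  # stand-in for io-bound work
    return name

# max_workers caps how many threads run at once;
# map submits all items and yields results in input order.
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(worker, range(100)))
```

The `with` block waits for all pending tasks before exiting, so there is no separate join step.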

donkopotamus