
I am trying to implement multiprocessing in my program.
Initially, I wrote this code:

import multiprocessing as mp

pool = mp.Pool(mp.cpu_count())

for i in range(0, 10000):
    bid = i
    ask = i
    pool.apply_async(function1, args=(bid, ask,))
    pool.apply_async(function2, args=(bid, ask,))
    pool.apply_async(function3, args=(bid, ask,))
    pool.close()
    pool.join()

This gave me an error:

ValueError: Pool not running

So I modified the code to:

for i in range(0, 10000):
    bid = i
    ask = i
    pool = mp.Pool(mp.cpu_count())
    pool.apply_async(function1, args=(bid, ask,))
    pool.apply_async(function2, args=(bid, ask,))
    pool.apply_async(function3, args=(bid, ask,))
    pool.close()
    pool.join()

This doesn't execute at all and just shows a blank terminal.

What I'm trying to achieve: for every value of i in the range, I want to run three functions in parallel, and only after those three functions have finished should the loop move on to the next i in range(0, 10000).
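
Roughly, I imagine the structure needs to look something like the sketch below (with function1/function2/function3 standing in for my real functions), where the pool is created once and each batch of three results is waited on before the next i, but I'm not sure this is correct:

import multiprocessing as mp

def function1(bid, ask): pass   # placeholders for the real workers
def function2(bid, ask): pass
def function3(bid, ask): pass

def main():
    pool = mp.Pool(mp.cpu_count())   # create the pool once, outside the loop
    for i in range(0, 10000):
        bid = ask = i
        # submit the three tasks for this value of i
        results = [pool.apply_async(f, args=(bid, ask))
                   for f in (function1, function2, function3)]
        # block until all three have finished before moving to the next i
        for r in results:
            r.get()
    pool.close()   # no more work will be submitted
    pool.join()    # wait for the worker processes to exit

if __name__ == '__main__':
    main()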

  • If you want synchronous workflow, why are you using asynchronous methods? – Olvin Roght Nov 25 '21 at 17:15
  • you shall use async io instead of mp – Dariusz Krynicki Nov 25 '21 at 17:21
  • Consider using the multiprocessing module. Refer to [this](https://stackoverflow.com/a/56138825/14973743) answer – Anand Sowmithiran Nov 25 '21 at 17:22
  • You have `pool.close()` inside your loop. Once a pool is closed, you can never send more work to it. Did you mean the `close` and `join` to be outside the loop? You'll have 30000 tasks running. – Frank Yellin Nov 25 '21 at 17:27
  • multiprocessing is for cpu bound tasks. threading is for network bound tasks. multiprocessing implements threading under the hood. he shall use async io if he wants to call multiple functions in parallel. – Dariusz Krynicki Nov 25 '21 at 17:34
  • Does this answer your question? [How to run functions in parallel?](https://stackoverflow.com/questions/7207309/how-to-run-functions-in-parallel) – Dariusz Krynicki Nov 25 '21 at 17:39

1 Answer

Here's a pattern that you can adapt:

from concurrent.futures import ProcessPoolExecutor

def func1(a, b):
    pass

def func2(a, b):
    pass

def func3(a, b):
    pass

def main():
    with ProcessPoolExecutor() as executor:  # worker processes are created once and reused
        for i in range(1_000):
            futures = []
            for func in [func1, func2, func3]:
                futures.append(executor.submit(func, i, i))
            for future in futures:
                future.result()  # block until this task finishes (re-raises any worker exception)

if __name__ == '__main__':
    main()
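
Note that the executor (and its worker processes) is created once and reused across every iteration, and calling result() on each future blocks until that task has finished, which is what provides the per-i barrier. If the return values are not needed, the same barrier can be had with concurrent.futures.wait; a minimal sketch of that variant, using the same placeholder functions (keep in mind that wait() does not re-raise worker exceptions, unlike result()):

from concurrent.futures import ProcessPoolExecutor, wait

def func1(a, b): pass
def func2(a, b): pass
def func3(a, b): pass

def main():
    with ProcessPoolExecutor() as executor:
        for i in range(1_000):
            # submit the three tasks for this i ...
            futures = [executor.submit(func, i, i) for func in (func1, func2, func3)]
            # ... and block until all of them are done before the next i
            wait(futures)

if __name__ == '__main__':
    main()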