There are a few different ways to solve this. The simplest is to use a multiprocessing.Pool and its apply_async method:
from multiprocessing import Pool

def func1():
    x = 2
    return x

def func2():
    y = 1
    return y

def func3():
    z = 5
    return z

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        r1 = pool.apply_async(func1, ())
        r2 = pool.apply_async(func2, ())
        r3 = pool.apply_async(func3, ())
        print(r1.get(timeout=1))
        print(r2.get(timeout=1))
        print(r3.get(timeout=1))
The multiprocessing.Pool is a rather helpful construct that takes care of the underlying communication between processes, setting up the pipes, queues, and whatever else is needed. The most common use case is to apply the same function to different pieces of data (distributing the work) using the .map method. However, it can also run different functions, for example via apply_async as I am doing here.
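For comparison, a minimal sketch of the .map approach could look like the following (the square function and the input list are just illustrative placeholders):

from multiprocessing import Pool

def square(n):
    # Same function applied to every input element
    return n * n

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        # Distribute the inputs across the worker processes
        results = pool.map(square, [1, 2, 3, 4, 5])
    print(results)  # [1, 4, 9, 16, 25]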
This, however, does not work from the interactive interpreter; the code must be saved in a .py file and run with python filename.py.