I have a script that contains a heavy CPU-bound function which loops through a list, and I want to speed it up with multiprocessing.
The script looks like this:
# ... do stuff
function_result = my_heavy_function()
# ... do stuff after I get function_result
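To make the shape concrete, here is a toy stand-in for that part (my_heavy_function, my_list and the per-item work here are just placeholders I made up, not my real code):

import time

my_list = list(range(200))  # placeholder data, the real list is bigger

def my_heavy_function():
    # toy stand-in: loops over my_list doing CPU-bound work per item
    return [x + sum(i * i for i in range(50_000)) for x in my_list]

start = time.perf_counter()
function_result = my_heavy_function()
print(f"heavy part took {time.perf_counter() - start:.2f}s")
# ... do stuff after I get function_result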
After reading a bit about multithreading and multiprocessing, I tried this:
import concurrent.futures

# ... do stuff
with concurrent.futures.ProcessPoolExecutor() as executor:
    function_result = list(executor.map(my_heavy_function, my_list))
# ... do stuff after I get function_result
But it took about a minute longer than the plain version of my script that uses no parallelism at all. (I honestly have no idea why; bless you if you can explain it to me.)
So all I need is to get the result of that heavy function as fast as possible and then work with the result at a "normal tempo". Is it possible to use multiprocessing this way, or am I doing something wrong?
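In case it helps, here is a minimal sketch of what I think the corrected parallel version should look like, assuming my_heavy_function can be applied to one item of my_list at a time and that everything runs under an if __name__ == "__main__" guard (both of those are assumptions on my part, and the function body is again a placeholder):

import concurrent.futures

def my_heavy_function(item):
    # placeholder for the real per-item CPU-bound work
    return item + sum(i * i for i in range(50_000))

my_list = list(range(200))  # placeholder input

if __name__ == "__main__":
    with concurrent.futures.ProcessPoolExecutor() as executor:
        # chunksize batches items into bigger tasks so less time is spent
        # on pickling and inter-process communication
        function_result = list(executor.map(my_heavy_function, my_list, chunksize=20))
    # ... do stuff after I get function_result
    print(len(function_result))

My guess is that if the per-item work is small, the process startup and pickling overhead could still eat any speedup, which might be what I'm seeing.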