
I'm trying to run two functions simultaneously in Python. I have tried the code below, which uses multiprocessing, but when I execute it, the second function starts only after the first is done.

from multiprocessing import Process

def func1():
    pass  # does something

def func2():
    pass  # does something

if __name__ == '__main__':
    p1 = Process(target=func1)
    p1.start()
    p2 = Process(target=func2)
    p2.start()
asked by user2739601, edited by martineau
    Are you sure the first one isn't just finishing quickly? Also, make sure the processes are truly independent, not waiting for resources the other is using or data the other will produce. – user2357112 Sep 18 '13 at 06:01
  • Have you ever read this post [How to run two functions simultaneously](http://stackoverflow.com/questions/2108126/how-to-run-two-functions-simultaneously) – zionpi Sep 18 '13 at 06:02
  • @user2357112: the first function takes around 2 minutes to execute when I run it, and both functions are completely independent. – user2739601 Sep 18 '13 at 06:08
  • Possible duplicate of [Python: How can I run python functions in parallel?](http://stackoverflow.com/questions/7207309/python-how-can-i-run-python-functions-in-parallel) – OrangeDog Jun 30 '16 at 14:20

6 Answers


You are doing it correctly. :)

Try running this silly piece of code:

from multiprocessing import Process
import sys

rocket = 0

def func1():
    global rocket
    print 'start func1'
    while rocket < sys.maxint:
        rocket += 1
    print 'end func1'

def func2():
    global rocket
    print 'start func2'
    while rocket < sys.maxint:
        rocket += 1
    print 'end func2'

if __name__=='__main__':
    p1 = Process(target = func1)
    p1.start()
    p2 = Process(target = func2)
    p2.start()

You will see it print 'start func1' and then 'start func2' and then after a (very) long time you will finally see the functions end. But they will indeed execute simultaneously.

Because processes take a while to start up, you may even see 'start func2' before 'start func1'.
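If you don't want to wait for that many iterations, a quicker way to convince yourself the processes really overlap (my own sketch, not part of the answer) is to time two one-second tasks:

```python
import time
from multiprocessing import Process

def slow_task():
    # Stand-in for real work: each task takes about one second.
    time.sleep(1)

if __name__ == '__main__':
    start = time.monotonic()
    procs = [Process(target=slow_task) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()  # wait for both to finish
    # If the tasks ran sequentially this would be ~2 s;
    # run in parallel it stays close to 1 s (plus startup overhead).
    print('elapsed: %.2f s' % (time.monotonic() - start))
```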

answered by Shashank, edited by stamaimer

This is just what I needed. I know it wasn't asked, but I modified Shashank's code to suit Python 3 for anyone else looking :)

from multiprocessing import Process
import sys

rocket = 0

def func1():
    global rocket
    print ('start func1')
    while rocket < sys.maxsize:
        rocket += 1
    print ('end func1')

def func2():
    global rocket
    print ('start func2')
    while rocket < sys.maxsize:
        rocket += 1
    print ('end func2')

if __name__=='__main__':
    p1 = Process(target=func1)
    p1.start()
    p2 = Process(target=func2)
    p2.start()

Substitute a smaller number for sys.maxsize and add a print(rocket) inside the loop, and you can watch the counter go up one at a time until it reaches that number and stops.
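Following that suggestion, here is a sketch (my own variation, Python 3) with the limit lowered to 10. Note that each process gets its own copy of the module globals, so the two processes count independently rather than sharing rocket:

```python
from multiprocessing import Process

rocket = 0

def count_up(name):
    global rocket
    # Each process has its own copy of 'rocket',
    # so both count from 0 to 10 independently.
    while rocket < 10:
        rocket += 1
        print(name, rocket)

if __name__ == '__main__':
    p1 = Process(target=count_up, args=('func1',))
    p2 = Process(target=count_up, args=('func2',))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
```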

answered by Alistair Bendall

This can be done elegantly with Ray, a system that allows you to easily parallelize and distribute your Python code.

To parallelize your example, you'd need to define your functions with the @ray.remote decorator, and then invoke them with .remote.

import ray

ray.init()

# Define functions you want to execute in parallel using 
# the ray.remote decorator.
@ray.remote
def func1():
    pass  # does something

@ray.remote
def func2():
    pass  # does something

# Execute func1 and func2 in parallel.
ray.get([func1.remote(), func2.remote()])

If func1() and func2() return results, you need to rewrite the code as follows:

ret_id1 = func1.remote()
ret_id2 = func2.remote()
ret1, ret2 = ray.get([ret_id1, ret_id2])

There are a number of advantages of using Ray over the multiprocessing module. In particular, the same code will run on a single machine as well as on a cluster of machines. For more advantages of Ray see this related post.

answered by Ion Stoica
    Unfortunately for me, there is no ray distribution for windows. – JediCate Feb 15 '19 at 09:34
  • This library is a piece of genius. Check out the walkthrough here: https://ray.readthedocs.io/en/latest/walkthrough.html. This reduces the overhead of trying to configure the pools and monitor the resources and provides automated methods to use available cpu and gpu resources. HIGHLY recommend! – RandallShanePhD Apr 17 '20 at 17:52
  • Unfortunately, it currently has a problem with class attributes, and so getter/setter methods, too :-( – leonard vertighel Dec 22 '21 at 22:45

This is a very good example by @Shashank. I just want to add that I had to call join() at the end; without it, the two processes did not appear to run simultaneously:

from multiprocessing import Process
import sys

rocket = 0

def func1():
    global rocket
    print 'start func1'
    while rocket < sys.maxint:
        rocket += 1
    print 'end func1'

def func2():
    global rocket
    print 'start func2'
    while rocket < sys.maxint:
        rocket += 1
    print 'end func2'

if __name__=='__main__':
    p1 = Process(target = func1)
    p1.start()
    p2 = Process(target = func2)
    p2.start()
    # This is where I had to add the join() function.
    p1.join()
    p2.join()

Furthermore, check out this thread: When to call .join() on a process?

answered by TheWalkingData
  • process.join() was the missing part in almost all example scripts for multiprocessing; without it, the processes run sequentially. Thanks for pointing this out. – nish Nov 30 '20 at 12:19
  • Suppose func1() requires an input argument. Can you please give an example on how to pass it? – Syed Md Ismail Apr 12 '21 at 14:03
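The comment above asks how to pass an input argument. Positional arguments go in the args tuple and keyword arguments in the kwargs dict of Process; a minimal sketch (the parameter names here are my own examples):

```python
from multiprocessing import Process

def func1(name, count=3):
    # 'name' and 'count' are hypothetical example parameters.
    for i in range(count):
        print(name, i)

if __name__ == '__main__':
    # args must be a tuple, hence the trailing comma.
    p = Process(target=func1, args=('worker',), kwargs={'count': 2})
    p.start()
    p.join()
```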

Here is another version, for when a dynamic list of processes needs to be run. I am including the two shell scripts, in case you want to try it:

t1.sh

for i in {1..10}
  do 
     echo "1... t.sh i:"$i
     sleep 1
  done

t2.sh

for i in {1..3}
  do
     echo "2.. t2.sh i:"$i
     sleep 1
  done

np.py

import os
from multiprocessing import Process, Lock

def f(l, cmd):
    os.system(cmd)

if __name__ == '__main__':
    lock = Lock()

    for cmd in ['sh t1.sh', 'sh t2.sh']:
        Process(target=f, args=(lock, cmd)).start()

output

1... t.sh i:1
2.. t2.sh i:1
1... t.sh i:2
2.. t2.sh i:2
1... t.sh i:3
2.. t2.sh i:3
1... t.sh i:4
1... t.sh i:5
1... t.sh i:6
1... t.sh i:7
1... t.sh i:8
1... t.sh i:9
1... t.sh i:10

"lock" left there can be acquired before task "l.acquire()" and released after "l.release()"

answered by Mike
Try using threads instead of multiprocessing:

import threading

rocket = 0

def func1():
    global rocket
    print('start func1')
    while rocket < 100:
        print("Im in func1")
        rocket += 1
        value = "Im global var "+str(rocket)+" from fun1"
        print(value)

    print ('end func1')

def func2():
    global rocket
    print ('start func2')
    while rocket < 100:
        print("Im in func2")
        rocket += 1
        value = "Im global var " + str(rocket) + " from fun2"
        print(value)
    print ('end func2')

if __name__ == '__main__':
    p1 = threading.Thread(target=func1)
    p2 = threading.Thread(target=func2)
    p1.start()
    p2.start()

Hope it works!