
I wonder if it's possible to spawn multiple processes using the subprocess module to run a function or a method defined in the same script (without needing to import it), so that the main script does not wait for execution to complete. Like so (the code is wrong, but it illustrates the concept):

import subprocess

def printMe(arg):
    print arg

myList = ['One', 'Two', 'Three', 'Four', 'Five']

for word in myList:
    # broken: Popen expects a command, not the result of a function call
    proc = subprocess.Popen(printMe(word), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
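For context, subprocess can only launch external commands, so the closest working equivalent is re-invoking the Python interpreter itself; the inline `-c` snippet below is a hypothetical stand-in for `printMe`, not code from the question:

```python
import subprocess
import sys

my_list = ['One', 'Two', 'Three', 'Four', 'Five']

# subprocess runs commands, not in-process functions, so the "function"
# must be expressed as a command line -- here an inline -c snippet that
# echoes its first argument.
procs = [
    subprocess.Popen(
        [sys.executable, '-c', 'import sys; print(sys.argv[1])', word],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
    )
    for word in my_list
]

# The main script continues immediately; output can be collected later.
outputs = [p.communicate()[0].decode().strip() for p in procs]
```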

EDITED:

Thanks for the comments! Apparently the multiprocessing module needs to be used to spawn an internal method or function. It appears that the multiprocessing method pool.map() behaves quite differently from a "standard" function call when it passes an argument variable to the function it calls.

Example

from multiprocessing import Pool

def printMe(arg):
    arg += "_Completed"
    return arg

myList = ['One', 'Two', 'Three']

pool = Pool(processes=10)
results = pool.map(printMe, myList)

print results, type(results), len(results)

# Prints: ['One_Completed', 'Two_Completed', 'Three_Completed'] <type 'list'> 3

SingleWord = "Once_Upon_A_Time"

pool = Pool(processes=10)
results = pool.map(printMe, SingleWord)

# Prints: ['O_Completed', 'n_Completed', 'c_Completed', 'e_Completed', '__Completed',
# 'U_Completed', 'p_Completed', 'o_Completed', 'n_Completed', '__Completed',
# 'A_Completed', '__Completed', 'T_Completed', 'i_Completed', 'm_Completed',
# 'e_Completed'] <type 'list'> 16
alphanumeric
    Use the [`multiprocessing`](http://docs.python.org/2/library/multiprocessing.html) module. You'll have to take care of passing around the data you want to process, though, so in your example you'll have to change `printMe`. – univerio Mar 01 '14 at 01:19
  • Or see this http://stackoverflow.com/questions/1190206/threading-in-python question – Gary Walker Mar 01 '14 at 01:22

2 Answers


You can use multiprocessing, and not necessarily with Pool.

import multiprocessing

def worker():
    """worker function"""
    print 'Worker'
    return

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker)
        jobs.append(p)
        p.start()
lpoignant

That's why multiprocessing became a standard lib.

from multiprocessing import Pool

def run(*args):
    # each item from the iterable arrives as a single tuple argument
    return sum(*args)

if __name__ == "__main__":
    pool = Pool(processes=10) # 10 processes
    results = pool.map(run, [(1, 1, 1), (2, 2, 2), (3, 3, 3)])
    print(results)

@Spuntnix As for your update: pool.map actually expects the second argument to be an iterable. So if you give it a string, it will iterate over the string and send each character as an argument.

Personally I'd prefer that str not be iterable. See also: https://mail.python.org/pipermail/python-3000/2006-April/000759.html

yegle
  • Thanks for the clarification! I've just tested pool.map() to spawn an internal method (function) in my code. Everything runs well, but execution stalls waiting for the spawned processes to finish. I thought the main idea behind spawning was that the program won't wait for the processes to complete. – alphanumeric Mar 01 '14 at 02:21
  • @Sputnix Use `pool.map_async` if that's what you want. – yegle Mar 01 '14 at 03:10
  • Yes! map_async was exactly what I needed! myProcess = pool.map_async( myFunction, myArgList, callback=results.append ) – alphanumeric Mar 01 '14 at 03:13