
I need this urgently in my Django site, but because of the time constraint I cannot make any heavy modifications. This is probably the cheapest in-place change.

If we just focus on either build or run...

  1. Now I get the id back from build (or run).

  2. All the heavy work is now in a separate function.


import multiprocessing as mp

def main():
    build_id = get_build_id(....)  # renamed from `id`, which shadows the builtin
    work = mp.Process(target=heavy_build_fn)
    work.start()
    return build_id

If I run this in the shell (I have not tested it in the actual Django app), the terminal does not return until the work process is done with its job. As a web app, I need to return the id right away. Can I put work in the background without blocking the response?

Thanks.

I've read How do I run another script in Python without waiting for it to finish?, but I want to know other ways to do it, for example by sticking with multiprocessing. The Popen solution may not actually be what I want.

Update: here is my test script:


import multiprocessing as mp
import time

def build():
    print('I build things')
    with open('first.txt', 'w+') as f:
        f.write('')

    time.sleep(10)
    with open('myname.txt', 'w+') as f:
        f.write('3')
    return

def main():
    build_p = mp.Process(name='build process', target=build)
    build_p.start()
    build_p.join(2)
    return 18

if __name__ == '__main__':

    v = main()
    print(v)
    print('done')

Console (output so far; the `|` is the cursor, stuck until the child finishes):

I build things
18
done
|

...and after waiting, finally:
user@user-P5E-VM-DO:~$ python mp3.py
I build things
18
done
user@user-P5E-VM-DO:~$ 
user1012451
  • One typical way to handle this is to use a work queue. Where on request the queue is populated and a separate process, possibly kicked off by cron or other scheduler, is used to consume the queue. – monkut Jun 27 '12 at 00:21
  • I've updated my post. If you run that, it hangs until the script is complete. I can see the return value, but I am afraid of the hanging. – user1012451 Jun 27 '12 at 01:33

2 Answers


Remove the join() call and you may have what you want; join() waits for the process to end before returning.

The value will be returned before the child process(es) finish; however, your parent process will stay alive until the child processes complete. Not sure if that's an issue for you or not.

This code:

import multiprocessing as mp
import time

def build():
    print('I build things')
    for i in range(10):
        with open('testfile{}.txt'.format(i), 'w+') as f:
            f.write('')
            time.sleep(5)


def main():
    build_p = mp.Process(name='build process', target=build)
    build_p.start()
    return 18

if __name__ == '__main__':

    v = main()
    print(v)
    print('done')

Returns:

> python mptest.py
18
done
I build things

If you need to allow the process to end while the child process continues check out the answers here:

Run Process and Don't Wait
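If the parent must be able to exit while the work continues, one option is to launch the worker in its own session with subprocess instead of multiprocessing (a sketch; `worker.py` is a hypothetical stand-alone script holding the heavy build logic):

```python
import subprocess
import sys

# Launch a separate Python in its own session (POSIX) so the parent
# can exit without waiting for it; 'worker.py' is hypothetical and
# would contain the heavy build logic.
proc = subprocess.Popen(
    [sys.executable, 'worker.py'],
    start_new_session=True,
)
print(proc.pid > 0)  # the parent is free immediately
```

Unlike a non-daemon multiprocessing child, a detached subprocess is not joined at interpreter exit, so the parent really can terminate first.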

monkut
  • Thanks. When I remove `join` like you said, it still has to wait for the child to end, so it defeats the purpose. Can I actually pass a function such as `build` into a Popen process? Thanks. – user1012451 Jun 27 '12 at 03:07
  • Right now I am able to see the return, but the process `build` is lost. It didn't continue. :( – user1012451 Jun 27 '12 at 04:19

No, the easiest way to handle what you want is probably to use a message broker. Django Celery is a great solution. It will let you queue a task and return your view right to the user. Your tasks will then be executed in the order they were queued.

I believe processes opened from Django are tied to the thread they were opened in, so your view will wait to return until your process is complete.

dm03514
  • Thanks. The problem is, this is not only a Django issue. That heavy work is done by non-Django programs. I've tried `thread.start_new_thread`. It ended, but I don't see anything kicked off from the heavy work... – user1012451 Jun 27 '12 at 00:23