
I am using Python v 2.2.1 and trying to write code that needs to perform some tasks in parallel inside a for loop. The number of iterations is not fixed and depends on the list (serverList, in this case). How can I achieve this through multithreading or multiprocessing? I read through some examples, but they were calling different functions in parallel. Here I need to call the same function (shutdownTask), but with different arguments. Part of the code is given below:

try:
    <some code>
    for server in serverList:
        print ('Shutting down server: '+server+'...')
        shutdownTask(server, serverType)
finally:
    verifyStatus()
asked by KnockTurnAl, edited by jfs
  • Have you tried [Pykka](https://www.pykka.org/en/latest/) ? – Vaulstein Jun 29 '15 at 07:30
  • Python v 2.2.1? Really? Because that is ancient. –  Jun 29 '15 at 07:32
  • Any reason you are using a 13-year-old version of Python? – skaz Jun 29 '15 at 07:32
  • Looking at the print function (and its parentheses), you might have meant Python v 3.2.1. Though I would still recommend upgrading to the current version (3.4). –  Jun 29 '15 at 07:33
  • The [introduction on Pythons multiprocessing](https://docs.python.org/3.4/library/multiprocessing.html) gives a very simple example of calling the same function in parallel with different arguments. You can adapt that to your needs. –  Jun 29 '15 at 07:35
  • follow [pep-8 naming conventions](https://www.python.org/dev/peps/pep-0008/#naming-conventions) unless you have a *specific* reason not to. – jfs Jun 29 '15 at 08:26
  • It is 2.2.1, and yes, I get that a lot. It is part of the product, so I cannot upgrade it and cannot install additional frameworks on it. So I was looking for something that comes built-in. – KnockTurnAl Jun 29 '15 at 08:34

2 Answers

0

I recommend Celery. You could launch one task per server in the background.

To create a task, you decorate your function with the @task decorator.

To run a task, you call the delay() method that is added to the decorated function.

Note: I don't know whether there are compatibility issues if you are really using the very old Python version 2.2.1.

UPDATE

If you don't need such a complex solution, then you could go with the built-in Thread class from the threading module.
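A minimal sketch of the Thread approach for the asker's loop: one thread per server, same target function, different arguments each time. Here shutdownTask is a stand-in for the question's real function, and serverType/serverList are hypothetical values from the question's context:

```python
import threading

results = []  # filled in by the stand-in task below (list.append is thread-safe)

def shutdownTask(server, serverType):
    # Stand-in for the question's real shutdown logic
    results.append((server, serverType))

serverType = 'app'  # hypothetical; comes from the question's surrounding code
serverList = ['srv1', 'srv2', 'srv3']

threads = []
for server in serverList:
    # Same target function for every thread, different arguments each time
    t = threading.Thread(target=shutdownTask, args=(server, serverType))
    t.start()
    threads.append(t)

for t in threads:
    t.join()  # wait until every shutdown has finished
```

After the joins, all servers have been processed, so a final verifyStatus()-style check can safely run.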

Caumons
  • Why install the whole Celery framework when possibly the built-in multiprocessing module can do this? –  Jun 29 '15 at 07:36
  • If you are creating quite a big system, Celery is a good choice. You can monitor the state of your tasks and manage them in a quite easy way. However, if it's a simple task and you don't require any management, you could go with `threading` – Caumons Jun 29 '15 at 07:37
  • I don't see any indication of a big system in the question, so I'd start small. –  Jun 29 '15 at 07:42
-1

You could use a process (or a thread) pool (assuming you meant Python version 3.2):

from multiprocessing import Pool

def shutdown(server):
    # Wrapper so the pool can call shutdownTask with a single argument;
    # it returns (server, error) so one failure doesn't kill the whole run
    try:
        shutdownTask(server, serverType)
    except Exception as e:
        return server, str(e)
    else:
        return server, None

try:
    <some code>
    pool = Pool()  # defaults to one worker process per CPU
    for server, error in pool.imap_unordered(shutdown, serverList):
        if error is None:
            print('Done shutting down server:', server, '...')
        else:
            print('Error shutting down server:', server, '...', error)
finally:
    verifyStatus()
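Since shutting down servers is mostly I/O waiting, the same pool API also works with threads instead of processes via multiprocessing.pool.ThreadPool, avoiding process-spawn overhead. A sketch with a stand-in shutdownTask (the real one, along with serverType, comes from the question's code):

```python
from multiprocessing.pool import ThreadPool  # thread-backed counterpart of Pool

def shutdownTask(server, serverType):
    # Stand-in for the question's real shutdown call
    if not server:
        raise ValueError('no server name')

def shutdown(server):
    # Same wrapper idea as above: report errors instead of raising
    try:
        shutdownTask(server, 'app')  # 'app' is a hypothetical serverType
    except Exception as e:
        return server, str(e)
    return server, None

serverList = ['srv1', 'srv2', 'srv3']
pool = ThreadPool(4)  # 4 worker threads
try:
    # imap_unordered yields results as each shutdown finishes
    results = list(pool.imap_unordered(shutdown, serverList))
finally:
    pool.close()
    pool.join()
```

The calling code stays identical to the process-pool version; only the import changes.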
jfs