219

I was wondering if there's any library for asynchronous method calls in Python. It would be great if you could do something like

@async
def longComputation():
    <code>


token = longComputation()
token.registerCallback(callback_function)
# alternative, polling
while not token.finished():
    doSomethingElse()
    if token.finished():
        result = token.result()

Or to call a non-async routine asynchronously

def longComputation():
    <code>

token = asynccall(longComputation())

It would be great to have a more refined strategy native in the language core. Has this been considered?

Benyamin Jafari
Stefano Borini
    As of Python 3.4: https://docs.python.org/3/library/asyncio.html (there's a backport for 3.3 and shiny new `async` and `await` syntax from 3.5). – jonrsharpe Dec 28 '15 at 11:16
  • There is no callback mechanism, but you can aggregate results in a dictionary and it is based on Python's multiprocessing module. I am sure you can add one more parameter to the decorated function as a callback. https://github.com/alex-sherman/deco. – RajaRaviVarma Jul 20 '16 at 08:14
  • To get started. Official Documentation - https://docs.python.org/3/library/concurrency.html – Adarsh Madrecha Jan 07 '19 at 08:37

14 Answers

227

Something like:

import threading

def foo():
    """Placeholder for the long-running work."""
    ...

thr = threading.Thread(target=foo, args=(), kwargs={})
thr.start()  # Will run "foo" in a separate thread
...
thr.is_alive()  # Will return whether "foo" is currently running
...
thr.join()  # Will wait till "foo" is done

See the documentation at https://docs.python.org/library/threading.html for more details.

Boris Verkhovskiy
Drakosha
  • Yeah, if you just need to do things asynchronously, why not just use a thread? After all, a thread is more lightweight than a process. – kk1957 Mar 21 '13 at 16:09
  • Important note: the standard implementation (CPython) of threads won't help with compute-bound tasks, due to the "Global Interpreter Lock". See the library doc: [link](http://docs.python.org/2/library/threading.html#module-threading) – solublefish May 26 '13 at 19:56
  • Is using thread.join() really asynchronous? What if you want to not block a thread (e.g. a UI thread) and not use a ton of resources doing a while loop on it? – Mgamerz Jun 05 '14 at 18:50
  • @Mgamerz join is synchronous. You could have the thread put the results of the execution in some queue, and/or call a callback. Otherwise you do not know when it's done (if at all). – Drakosha Jun 05 '14 at 22:10
  • @Drakosha: In my case, threads would still involve [too much overhead](http://stackoverflow.com/q/33453581/2284570). – user2284570 Oct 31 '15 at 17:15
  • Is it possible to call a callback function at the end of the thread execution, like you can do with multiprocessing.Pool? – Reda Drissi Mar 28 '19 at 13:10
155

You can use the multiprocessing module added in Python 2.6. You can use pools of processes and then get results asynchronously with:

apply_async(func[, args[, kwds[, callback]]])

E.g.:

import time
from multiprocessing import Pool

def postprocess(result):
    print("finished: %s" % result)

def f(x):
    return x*x

if __name__ == '__main__':
    pool = Pool(processes=1)              # Start a worker process.
    result = pool.apply_async(f, [10], callback=postprocess) # Evaluate "f(10)" asynchronously calling callback when finished.
    print("waiting...")
    time.sleep(1)

This is only one alternative. The module provides many facilities to achieve what you want. Also, it is easy to build a decorator on top of this.
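As a sketch of that decorator idea (the names `background` and `_pool` are illustrative, not part of the multiprocessing API), using `multiprocessing.dummy` so the wrapped function needs no pickling or `__main__` guard:

```python
from multiprocessing.dummy import Pool  # same API as Pool, but backed by threads

_pool = Pool(2)

def background(func):
    """Decorator: submit func via apply_async and return the AsyncResult handle."""
    def wrapper(*args, **kwargs):
        return _pool.apply_async(func, args, kwargs)
    return wrapper

@background
def square(x):
    return x * x

handle = square(7)   # returns immediately with an AsyncResult
print(handle.get())  # blocks for the result: 49
```

The same `callback=` argument shown above can still be passed through `apply_async` inside the wrapper if a callback-style interface is preferred.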

Lucas S.
  • It's probably worth bearing in mind that this spawns separate processes rather than separate threads within a process. This might have some implications. – user47741 Feb 06 '10 at 11:12
  • Lucas S., your example does not work, unfortunately. The callback function never gets called. – DataGreed Aug 29 '09 at 23:50
  • This works: result = pool.apply_async(f, [10], callback=finish) – Michael Allan Jackson Nov 18 '11 at 00:52
  • To truly do anything asynchronously in Python requires using the multiprocessing module to spawn new processes. Merely creating new threads is still at the mercy of the Global Interpreter Lock, which prevents a Python process from doing multiple things at once. – Drahkar Dec 22 '14 at 01:00
  • @LucasS.: Any idea how to do it [for networking without creating fork bombs](http://stackoverflow.com/q/33453581/2284570)? – user2284570 Oct 31 '15 at 17:14
  • In case you don't want to spawn a new process while using this solution, change the import to `from multiprocessing.dummy import Pool`. multiprocessing.dummy has the exact same behavior, implemented over threads instead of processes. – Almog Cohen Mar 27 '16 at 12:28
74

As of Python 3.4, you can use asyncio with generator-based coroutines; Python 3.5 adds dedicated async/await syntax.

import asyncio
import datetime

Enhanced generator syntax:

@asyncio.coroutine
def display_date(loop):
    end_time = loop.time() + 5.0
    while True:
        print(datetime.datetime.now())
        if (loop.time() + 1.0) >= end_time:
            break
        yield from asyncio.sleep(1)


loop = asyncio.get_event_loop()
# Blocking call which returns when the display_date() coroutine is done
loop.run_until_complete(display_date(loop))
loop.close()

New async/await syntax:

async def display_date(loop):
    end_time = loop.time() + 5.0
    while True:
        print(datetime.datetime.now())
        if (loop.time() + 1.0) >= end_time:
            break
        await asyncio.sleep(1)


loop = asyncio.get_event_loop()
# Blocking call which returns when the display_date() coroutine is done
loop.run_until_complete(display_date(loop))
loop.close()
camabeh
  • @camabeh, could you extend that example to include the OP's "def longComputation()" function? Most examples use "await asyncio.sleep(1)", but if longComputation() returns, say, a double, you can't just use "await longComputation()". – Fab May 13 '17 at 02:51
  • Ten years in the future and this should be the accepted answer now. When you talk about async in Python 3.5+, what comes to mind should be asyncio and the async keyword. – zeh Jun 22 '20 at 01:34
  • This answer uses the "new and shiny" Python syntax. This should be the #1 answer now. – msarafzadeh Jul 21 '20 at 13:03
31

It's not in the language core, but a very mature library that does what you want is Twisted. It introduces the Deferred object, to which you can attach callbacks or error handlers ("errbacks"). A Deferred is basically a "promise" that a function will have a result eventually.
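To illustrate the idea, here is a toy sketch of the Deferred/promise concept. This is not Twisted's actual implementation (see twisted.internet.defer for the real API); the class and method names are illustrative only.

```python
# Toy promise: callbacks registered before the result arrives are queued;
# callbacks added after it has fired run immediately.
class SimpleDeferred:
    def __init__(self):
        self._callbacks = []
        self._fired = False
        self._result = None

    def add_callback(self, fn):
        if self._fired:
            fn(self._result)
        else:
            self._callbacks.append(fn)
        return self  # allows chaining, like Twisted's addCallback

    def fire(self, result):
        self._fired = True
        self._result = result
        for fn in self._callbacks:
            fn(result)

received = []
d = SimpleDeferred()
d.add_callback(received.append)  # registered before the result exists
d.fire(42)                       # the producer supplies the result later
d.add_callback(received.append)  # late registration fires immediately
```

Twisted's real Deferred adds errback chains, cancellation, and chaining of return values between callbacks on top of this basic shape.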

Ron Klein
Meredith L. Patterson
  • In particular, look at twisted.internet.defer (http://twistedmatrix.com/documents/8.2.0/api/twisted.internet.defer.html). – Nicholas Riley Oct 31 '09 at 20:56
23

You can implement a decorator to make your functions asynchronous, though that's a bit tricky. The multiprocessing module is full of little quirks and seemingly arbitrary restrictions – all the more reason to encapsulate it behind a friendly interface.

from inspect import getmodule
from multiprocessing import Pool


# "async" became a reserved keyword in Python 3.7, hence the name run_async.
def run_async(decorated):
    r'''Wraps a top-level function around an asynchronous dispatcher.

        When the decorated function is called, a task is submitted to a
        process pool, and a future object is returned, providing access to an
        eventual return value.

        The future object has a blocking get() method to access the task
        result: it will return immediately if the job is already done, or block
        until it completes.

        This decorator won't work on methods, due to limitations in Python's
        pickling machinery (in principle methods could be made pickleable, but
        good luck on that).
    '''
    # Keeps the original function visible from the module global namespace,
    # under a name consistent with its __name__ attribute. This is necessary
    # for the multiprocessing pickling machinery to work properly.
    module = getmodule(decorated)
    decorated.__name__ += '_original'
    setattr(module, decorated.__name__, decorated)

    def send(*args, **opts):
        return run_async.pool.apply_async(decorated, args, opts)

    return send

The code below illustrates usage of the decorator:

@run_async
def printsum(uid, values):
    summed = 0
    for value in values:
        summed += value

    print("Worker %i: sum value is %i" % (uid, summed))

    return (uid, summed)


if __name__ == '__main__':
    from random import sample

    # The process pool must be created inside __main__.
    run_async.pool = Pool(4)

    p = range(0, 1000)
    results = []
    for i in range(4):
        result = printsum(i, sample(p, 100))
        results.append(result)

    for result in results:
        print("Worker %i: sum value is %i" % result.get())

In a real-world case I would elaborate a bit more on the decorator, providing some way to turn it off for debugging (while keeping the future interface in place), or maybe a facility for dealing with exceptions; but I think this demonstrates the principle well enough.

xperroni
21

Just

import threading, time

def f():
    print("f started")
    time.sleep(3)
    print("f finished")

threading.Thread(target=f).start()
tc.
Antigluk
11

My solution is:

import threading

class TimeoutError(RuntimeError):
    pass

class AsyncCall(object):
    def __init__(self, fnc, callback = None):
        self.Callable = fnc
        self.Callback = callback

    def __call__(self, *args, **kwargs):
        self.Thread = threading.Thread(target = self.run, name = self.Callable.__name__, args = args, kwargs = kwargs)
        self.Thread.start()
        return self

    def wait(self, timeout = None):
        self.Thread.join(timeout)
        if self.Thread.is_alive():
            raise TimeoutError()
        else:
            return self.Result

    def run(self, *args, **kwargs):
        self.Result = self.Callable(*args, **kwargs)
        if self.Callback:
            self.Callback(self.Result)

class AsyncMethod(object):
    def __init__(self, fnc, callback=None):
        self.Callable = fnc
        self.Callback = callback

    def __call__(self, *args, **kwargs):
        return AsyncCall(self.Callable, self.Callback)(*args, **kwargs)

def Async(fnc = None, callback = None):
    if fnc is None:
        def AddAsyncCallback(fnc):
            return AsyncMethod(fnc, callback)
        return AddAsyncCallback
    else:
        return AsyncMethod(fnc, callback)

And works exactly as requested:

@Async
def fnc():
    pass
Nevyn
  • That's an elegant way. Does it work in say Python 2? If it does, it almost is like `async` keyword sweetness without its requirements – FLAW Mar 26 '22 at 18:18
8

You could use eventlet. It lets you write what appears to be synchronous code, but have it operate asynchronously over the network.

Here's an example of a super minimal crawler:

urls = ["http://www.google.com/intl/en_ALL/images/logo.gif",
     "https://wiki.secondlife.com/w/images/secondlife.jpg",
     "http://us.i1.yimg.com/us.yimg.com/i/ww/beta/y3.gif"]

import eventlet
from eventlet.green import urllib2

def fetch(url):
  return urllib2.urlopen(url).read()

pool = eventlet.GreenPool()

for body in pool.imap(fetch, urls):
  print "got body", len(body)
Raj
6

Since Python 3.7, the newer way to run asyncio code is asyncio.run(), instead of creating a loop, calling loop.run_until_complete(), and closing it:

import asyncio
import datetime

async def display_date(delay):
    loop = asyncio.get_running_loop()
    end_time = loop.time() + delay
    while True:
        print("Blocking...", datetime.datetime.now())
        await asyncio.sleep(1)
        if loop.time() > end_time:
            print("Done.")
            break


asyncio.run(display_date(5))
Benyamin Jafari
5

Something like this works for me: you can then call the function, and it will dispatch itself onto a new thread.

import time
from thread import start_new_thread  # in Python 3, this module is named "_thread"

def dowork(asynchronous=True):
    if asynchronous:
        args = (False,)  # must be a tuple; (False) is just the value False
        start_new_thread(dowork, args)  # Call itself on a new thread.
    else:
        while True:
            # do something...
            time.sleep(60)  # sleep for a minute
    return
Nicholas Hamilton
4

You can use concurrent.futures (added in Python 3.2).

import time
from concurrent.futures import ThreadPoolExecutor


def long_computation(duration):
    for x in range(0, duration):
        print(x)
        time.sleep(1)
    return duration * 2


print('Use polling')
with ThreadPoolExecutor(max_workers=1) as executor:
    future = executor.submit(long_computation, 5)
    while not future.done():
        print('waiting...')
        time.sleep(0.5)

    print(future.result())

print('Use callback')
executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(long_computation, 5)
future.add_done_callback(lambda f: print(f.result()))

print('waiting for callback')

executor.shutdown(False)  # non-blocking

print('shutdown invoked')
Big Pumpkin
  • This is a great answer, as it is the only one here that offers the possibility of a thread pool with callbacks. – Reda Drissi Mar 29 '19 at 14:22
  • Unfortunately, this also suffers from the "Global Interpreter Lock". See the library doc: [link](https://docs.python.org/2/library/threading.html#module-threading). Tested with Python 3.7 – Alex Jan 25 '20 at 08:29
  • Is this a blocking async call? – Golden Lion Aug 19 '21 at 16:42
3

The native Python way to make asynchronous calls in 2021, with Python 3.9, also suitable for the Jupyter / IPython kernel:

Camabeh's answer is the way to go since Python 3.5.


import asyncio
import datetime

async def display_date(loop):
    end_time = loop.time() + 5.0
    while True:
        print(datetime.datetime.now())
        if (loop.time() + 1.0) >= end_time:
            break
        await asyncio.sleep(1)


loop = asyncio.get_event_loop()
# Blocking call which returns when the display_date() coroutine is done
loop.run_until_complete(display_date(loop))
loop.close()

This will work in Jupyter Notebook / Jupyter Lab but throw an error:

RuntimeError: This event loop is already running

Due to IPython's use of event loops, we need something called nested asynchronous loops, which is not yet implemented in Python. Luckily there is nest_asyncio to deal with the issue. All you need to do is:

!pip install nest_asyncio # use ! within Jupyter Notebook, else pip install in shell
import nest_asyncio
nest_asyncio.apply()

(Based on this thread)

Only when you call loop.close() does it throw another error, as it probably refers to IPython's main loop:

RuntimeError: Cannot close a running event loop

I'll update this answer as soon as someone answers this GitHub issue.

do-me
2

Is there any reason not to use threads? You can use the Thread class. Instead of the finished() function, use is_alive(). The result() function could join() the thread and retrieve the result. And, if you can, override the run() and __init__ methods to call the function specified in the constructor and save the value somewhere in the instance of the class.
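A minimal sketch of that approach (the class and method names are illustrative): subclass Thread, store the return value on the instance in run(), and let result() join and return it.

```python
import threading

class FunctionThread(threading.Thread):
    """Runs func(*args, **kwargs) in a thread and keeps its return value."""
    def __init__(self, func, *args, **kwargs):
        super().__init__()
        self._func = func
        self._args = args
        self._kwargs = kwargs
        self._result = None

    def run(self):
        # Executed in the new thread; stash the return value on the instance.
        self._result = self._func(*self._args, **self._kwargs)

    def finished(self):
        return not self.is_alive()

    def result(self):
        self.join()  # block until run() has stored the value
        return self._result

token = FunctionThread(sum, [1, 2, 3])
token.start()
print(token.result())  # 6
```

This matches the token-style interface the question asks for, with the usual caveat that exceptions raised inside run() are swallowed unless you also capture and re-raise them in result().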

Aminah Nuraini
ondra
  • If it's a computationally expensive function, threading won't get you anything (it will probably make things slower, actually), since a Python process is limited to one CPU core due to the GIL. – Sep 23 '09 at 21:01
  • @Kurt, while that's true, the OP didn't mention that performance was his concern. There are other reasons for wanting asynchronous behaviour... – Peter Hansen Dec 14 '09 at 13:24
  • Threads in Python aren't great when you want to have the option of killing the asynchronous method call, since only the main thread in Python receives signals. – CivFan Jan 20 '16 at 17:43
0

You can use a process. If you want to run it forever (e.g. for networking), use a while loop in your function:

from multiprocessing import Process

def foo():
    while True:
        pass  # Do something

p = Process(target=foo)
p.start()

if you just want to run it one time, do like that:

from multiprocessing import Process

def foo():
    pass  # Do something

p = Process(target=foo)
p.start()
p.join()
Keivan