
I have this minimal example:

from functools import wraps
from concurrent import futures
import random

def decorator(func):
    num_process = 4

    @wraps(func)
    def impl(*args, **kwargs):
        with futures.ProcessPoolExecutor() as executor:
            fs = []
            for i in range(num_process):
                fut = executor.submit(func, *args, **kwargs)
                fs.append(fut)
            result = []
            for f in futures.as_completed(fs):
                result.append(f.result())
        return result
    return impl

@decorator
def get_random_int():
    return random.randint(0, 100)


if __name__ == "__main__":
    result = get_random_int()
    print(result)

If you run this function, you get the following error:

_pickle.PicklingError: Can't pickle <function get_random_int at 0x7f06cee666a8>: it's not the same object as __main__.get_random_int

I think the main issue here is that the wraps decorator alters the func object, making it impossible to pickle. I find this rather strange. Is there any way to work around this behavior? I would like to keep using wraps if possible. Thanks!

Bob Fang
  • Agree. I've seen a similar issue. It complains about the pickling. Wondering if anyone has a way to fix this. – broccoli Oct 17 '20 at 15:46

1 Answer


This is because run_in_executor calls functools.partial on the decorated function (see https://docs.python.org/3/library/asyncio-eventloop.html#asyncio-pass-keywords). The picklability of partial objects is spotty (see: Are partial functions "officially" picklable?), but as noted in this comment on Pickling wrapped partial functions, partial functions are only picklable when the function being wrapped is in the global namespace. We know run_in_executor with a ProcessPoolExecutor works for non-wrapped functions, since that pattern is documented in asyncio. To get around this, I decorate a dummy function and pass the function I actually want executed in multiple processes as an argument to the decorator:

from concurrent import futures
import random

def decorator(multiprocess_func):
    def _decorate(func):
        num_process = 4

        def impl(*args, **kwargs):
            with futures.ProcessPoolExecutor() as executor:
                fs = []
                for i in range(num_process):
                    fut = executor.submit(multiprocess_func, *args, **kwargs)
                    fs.append(fut)
                result = []
                for f in futures.as_completed(fs):
                    result.append(f.result())
            return result
        return impl
    return _decorate

def _get_random_int():
    return random.randint(0, 100)

@decorator(_get_random_int)
def get_random_int():
    return _get_random_int()


if __name__ == "__main__":
    result = get_random_int()
    print(result)

I ultimately decided that not using a decorator at all was cleaner:

from concurrent import futures
import random

def decorator(multiprocess_func):
    num_process = 4

    def impl(*args, **kwargs):
        with futures.ProcessPoolExecutor(max_workers=num_process) as executor:
            fs = []
            for i in range(num_process):
                fut = executor.submit(multiprocess_func, *args, **kwargs)
                fs.append(fut)
            result = []
            for f in futures.as_completed(fs):
                result.append(f.result())
        return result
    return impl

def _get_random_int():
    return random.randint(0, 100)

get_random_int = decorator(_get_random_int)


if __name__ == "__main__":
    result = get_random_int()
    print(result)

This is similar to the linked answer above about pickling wrapped partial functions.

Cappie