
I am trying to write a function that sends multiple requests at once and returns the first response. I am currently using a `concurrent.futures.ThreadPoolExecutor`. I understand that stopping a request mid-flight is complicated, so instead I thought I could leave the other threads running in the background and return a value early. However, the function seems to wait for the other threads to finish before returning. How can I prevent that? My code looks like this:

import concurrent.futures

def req(urls):
    with concurrent.futures.ThreadPoolExecutor() as executor:
        futures = []
        for url in urls:
            futures.append(executor.submit(get_request, url))
        for future in concurrent.futures.as_completed(futures):
            if future.result():
                return future.result()  # Other threads should stop now
    return False  # No valid response was sent
One Nose
  • Multithreading for I/O operations is not the way to go in Python due to the [GIL](https://realpython.com/python-gil/). Look into asynchronous I/O via `asyncio`. See [here](https://stackoverflow.com/questions/57126286/fastest-parallel-requests-in-python/57129241#57129241). – felipe Dec 16 '21 at 20:00
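The `asyncio` approach the comment suggests could look roughly like the sketch below. `fetch` here is a stand-in that just sleeps instead of making a real HTTP call (a real version would use an async HTTP client such as `aiohttp`); the point is that `asyncio.wait(..., return_when=FIRST_COMPLETED)` hands back the first finished task, and the remaining tasks can actually be cancelled, unlike threads:

```python
import asyncio

async def fetch(url):
    # Stand-in for a real async HTTP request: simulate varying latencies.
    await asyncio.sleep(len(url) * 0.01)
    return f"response from {url}"

async def first_response(urls):
    tasks = [asyncio.create_task(fetch(u)) for u in urls]
    # Return as soon as any one task finishes.
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()  # coroutines, unlike threads, can be cancelled cleanly
    return next(iter(done)).result()

print(asyncio.run(first_response(["a", "bbbb", "cc"])))
```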

1 Answer


Try [`Executor.shutdown`](https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.Executor.shutdown) with `wait=False` and `cancel_futures=True` (the latter requires Python 3.9+). Note that you also have to drop the `with` statement, because its exit calls `shutdown(wait=True)` again and blocks on the running threads anyway:

import concurrent.futures

def req(urls):
    # No `with` block: __exit__ would call shutdown(wait=True) and
    # block until the in-flight requests finish.
    executor = concurrent.futures.ThreadPoolExecutor()
    futures = [executor.submit(get_request, url) for url in urls]
    for future in concurrent.futures.as_completed(futures):
        if future.result():
            # Cancel futures that haven't started; threads that are already
            # running keep going in the background, but we don't wait for them.
            executor.shutdown(wait=False, cancel_futures=True)
            return future.result()
    executor.shutdown(wait=False)
    return False  # No valid response was received
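A self-contained way to check the early return, with a stand-in `get_request` that just sleeps (the names and delays here are illustrative, not from the original post): the slow tasks keep running in background threads, but `req` itself returns as soon as the first result is in.

```python
import concurrent.futures
import time

def get_request(delay):
    # Stand-in for a real HTTP request: sleep, then return a value.
    time.sleep(delay)
    return f"done after {delay}s"

def req(delays):
    # Create the executor without a `with` block so nothing re-waits on exit.
    executor = concurrent.futures.ThreadPoolExecutor()
    futures = [executor.submit(get_request, d) for d in delays]
    for future in concurrent.futures.as_completed(futures):
        if future.result():
            executor.shutdown(wait=False, cancel_futures=True)
            return future.result()
    executor.shutdown(wait=False)
    return False

start = time.monotonic()
result = req([0.1, 2.0, 2.0])
elapsed = time.monotonic() - start
print(result, elapsed)  # returns after ~0.1s, not ~2s
```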
mama