
Is it possible to fire a request and not wait for response at all?

For Python, most internet searches turn up:

  1. asynchronous-requests-with-python-requests
  2. grequests
  3. requests-futures

However, all of the above solutions spawn a new thread and wait for the response on that thread. Is it possible to not wait for any response at all, anywhere?

Sid-Ant

2 Answers


You can run your thread as a daemon; see the code below. If I comment out the line (t.daemon = True), the program waits for the thread to finish before exiting. With daemon set to True, it simply exits. You can try it with the example below.

import requests
import threading
import time

def get_thread():
    g = requests.get("http://www.google.com")
    time.sleep(2)
    print(g.text[0:100])


if __name__ == '__main__':
    t = threading.Thread(target=get_thread)
    t.daemon = True  # Try commenting this out, running it, and see the difference
    t.start()
    print("Done")
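
One caveat: a daemon thread is also killed as soon as the main program exits, so in the example above the request to google.com may never complete if nothing keeps the process alive. The fire-and-forget pattern itself can be sketched with a hypothetical helper (fire_and_forget and slow_task are illustrative names, not library APIs):

```python
import threading
import time

def fire_and_forget(fn, *args):
    # Run fn on a daemon thread; the main program does not wait for it,
    # and the thread will not keep the process alive on exit.
    t = threading.Thread(target=fn, args=args, daemon=True)
    t.start()
    return t

def slow_task(results):
    time.sleep(0.2)          # stand-in for a network request
    results.append("done")

results = []
t = fire_and_forget(slow_task, results)
# The worker sleeps first, so results is still empty here.
print("main thread continues immediately; results =", results)
```

The trade-off is exactly the one the answer describes: the caller gets control back instantly, at the cost of any guarantee that the work finished.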
SteveJ

I don't really know what you are trying to achieve by just firing an HTTP request, so I will list some use cases I can think of.

Ignoring the result

If the only thing you want is for your program to feel like it never stops to make a request, you can use a library like aiohttp to fire requests without awaiting the responses.

import aiohttp
import asyncio

async def main():
    async with aiohttp.ClientSession() as session:
        # Not awaited: this only creates the request coroutine, so Python
        # warns that it was never awaited and the request is never sent.
        session.get('http://python.org')

asyncio.run(main())
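
If you want the request actually sent while still not blocking on the response, asyncio.create_task schedules the coroutine in the background. A minimal sketch, using a placeholder coroutine in place of session.get so it runs without any network access:

```python
import asyncio

async def send_request():
    # Placeholder standing in for session.get(...); simulated latency.
    await asyncio.sleep(0.1)
    return "response"

async def main():
    # create_task schedules the coroutine on the running loop right away,
    # without blocking here for its result.
    task = asyncio.create_task(send_request())
    print("request scheduled, main() keeps going")
    # The task only runs while the loop is alive; yield to it briefly so it
    # can complete before the program exits.
    await asyncio.sleep(0.2)
    return task

task = asyncio.run(main())
```

Note the final sleep: a task that is never given loop time is simply cancelled when the loop closes, which brings back the same question as below.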

But how can you know that the request was made successfully if you never check anything?

Ignoring the body

Maybe you want to be very performant and you are worried about losing time reading the body. In that case you can just fire the request, check the status code, and then close the connection.

import http.client

def make_request(url="yahoo.com", timeout=50):
    conn = http.client.HTTPConnection(url, timeout=timeout)
    conn.request("GET", "/")
    res = conn.getresponse()
    print(res.status)
    conn.close()

If you close the connection as shown above, you won't be able to reuse it for later requests, losing the benefit of HTTP keep-alive.
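
Reuse also requires draining the body first: http.client can only send the next request on the same socket after the previous response has been fully read. A sketch against a throwaway local server (my addition, so it runs without internet access):

```python
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # HTTP/1.1 keeps the connection open between requests

    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # silence per-request logging
        pass

# Throwaway local server bound to an ephemeral port.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

conn = http.client.HTTPConnection(host, port)
statuses = []
for _ in range(2):
    conn.request("GET", "/")
    res = conn.getresponse()
    statuses.append(res.status)
    res.read()   # drain the body; only then can the same socket be reused
conn.close()
server.shutdown()
print(statuses)  # prints [200, 200]
```

Both requests here travel over the same TCP connection, which is what closing after the status check gives up.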

The right way

I would recommend awaiting the asynchronous calls with aiohttp, so you can add the necessary logic without having to block.

But if you are looking for raw performance, a custom solution built on the http.client library may be necessary. You could also consider very small requests/responses, short timeouts, and compression on both the client and the server.

hernan