
I'm trying to send requests to a microservice running on localhost. I'm using requests.Session() for the Python client because I want to keep the connection open for further requests. I measured the per-request overhead and it's quite slow, ~4 ms. When I send the same request from other clients (JS, for example), it is much faster (~0.5 ms). Any ideas?

The code is very simple:

session_holder = requests.Session()
session_holder.post(url, json=data, headers=headers)
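To make the ~4 ms figure concrete, here is a minimal, self-contained sketch of how that per-request overhead could be measured, assuming `requests` is installed. The echo server and the `{"ping": "pong"}` payload are hypothetical stand-ins for the real microservice; only the `Session.post` timing pattern is the point.

```python
# Hypothetical benchmark sketch: time repeated POSTs over one
# requests.Session against a throwaway local echo server.
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests

class EchoHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # keep-alive, so the Session can reuse the connection

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)  # echo the request body back

    def log_message(self, *args):  # silence per-request logging
        pass

# Stand-in for the real microservice: an echo server on a free port.
server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

session = requests.Session()
data = {"ping": "pong"}
session.post(url, json=data)  # warm-up: opens the TCP connection

n = 20
start = time.perf_counter()
for _ in range(n):
    resp = session.post(url, json=data)
elapsed_ms = (time.perf_counter() - start) / n * 1000
print(f"avg per request: {elapsed_ms:.2f} ms")

session.close()  # close the keep-alive connection so the server thread can exit
server.shutdown()
```

The warm-up request matters: without it, the first measurement also pays for connection setup, which is exactly what the Session is supposed to amortize.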
  • It might be helpful if you add some sample code or other additional information. Your description is a bit vague, it's hard to tell if there is a problem, or what exactly it is. Have you tried without the session? What was the performance then? Are you sending a handful of requests or many? Do you need asynchronicity? – pdowling Oct 31 '18 at 10:13
  • I tried without the session() as well, it was worse. it's the same time if I send one request (no asynchronicity is needed yet). – Gita Ferber Oct 31 '18 at 10:34
  • According to this [question](https://stackoverflow.com/questions/36087637/how-often-does-python-requests-perform-dns-queries) requests/urllib3 is always doing a DNS lookup, maybe thats why. – Maurice Meyer Oct 31 '18 at 11:15
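One way to test the DNS hypothesis from the comments is to time name resolution for "localhost" directly with the standard library; this is a rough sketch, not a full profile of what requests/urllib3 does internally.

```python
# Rough check: how long does resolving "localhost" take on this machine?
# If this is a significant fraction of 4 ms, per-request lookups could
# explain part of the overhead.
import socket
import time

n = 100
start = time.perf_counter()
for _ in range(n):
    socket.getaddrinfo("localhost", 80, proto=socket.IPPROTO_TCP)
avg_ms = (time.perf_counter() - start) / n * 1000
print(f"avg getaddrinfo('localhost'): {avg_ms:.3f} ms")
```

If resolution turns out to be slow, using "127.0.0.1" in the URL instead of "localhost" sidesteps the lookup entirely.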

1 Answer


You can try using urllib instead of requests; since requests is a higher-level HTTP client interface, it carries some overhead.
You can also try aiohttp, which leverages asyncio.
Here is a great tutorial: Intro To asyncio
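A minimal sketch of the same POST done with the standard-library urllib.request, assuming a JSON-echoing endpoint; the local echo server and the `{"hello": "world"}` payload are hypothetical stand-ins for the real service.

```python
# Hypothetical sketch: the same JSON POST using only the standard
# library (urllib.request) instead of requests.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)  # echo the request body back

    def log_message(self, *args):  # silence per-request logging
        pass

# Stand-in for the real microservice.
server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

payload = json.dumps({"hello": "world"}).encode("utf-8")
req = request.Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
with request.urlopen(req) as resp:
    echoed = json.loads(resp.read())
print(echoed)

server.shutdown()
```

One trade-off to note: urllib.request does no connection pooling, so each urlopen call opens a fresh connection — the per-call overhead may be lower, but you lose the keep-alive behavior that requests.Session provides.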

Dani G