I was testing different Python HTTP libraries today and realized that the http.client library seems to perform much faster than requests.

To test it, you can run the following two code samples.
import http.client

# Reuse a single connection for all 1000 requests
conn = http.client.HTTPConnection("localhost", port=8000)
for i in range(1000):
    conn.request("GET", "/")
    r1 = conn.getresponse()
    body = r1.read()  # response must be read before the next request
    print(r1.status)
conn.close()
And here is the code doing the same thing with python-requests:
import requests

# Use a Session so the underlying TCP connection is pooled and reused
with requests.Session() as session:
    for i in range(1000):
        r = session.get("http://localhost:8000")
        print(r.status_code)
If I start the built-in http.server:
> python -m http.server
and run the above code samples (I'm using Python 3.5.2), I get the following results:
http.client:
0.35user 0.10system 0:00.71elapsed 64%CPU
python-requests:
1.76user 0.10system 0:02.17elapsed 85%CPU
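(The output above is in the format of the Unix time command.) If you'd rather reproduce the comparison in a single script instead of timing two separate runs, a rough sketch like the one below (assuming the same local server on port 8000; the 1000-request count is just for illustration) times both loops with time.perf_counter:

import time
import http.client
import requests

N = 1000
HOST, PORT = "localhost", 8000

# Time the http.client loop (one persistent connection)
start = time.perf_counter()
conn = http.client.HTTPConnection(HOST, port=PORT)
for _ in range(N):
    conn.request("GET", "/")
    conn.getresponse().read()
conn.close()
print("http.client: {:.2f}s".format(time.perf_counter() - start))

# Time the requests loop (one Session, pooled connection)
start = time.perf_counter()
with requests.Session() as session:
    for _ in range(N):
        session.get("http://{}:{}/".format(HOST, PORT))
print("requests:    {:.2f}s".format(time.perf_counter() - start))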
Are my measurements and tests correct? Can you reproduce them? If so, does anyone know what's going on inside http.client that makes it so much faster? Why is there such a big difference in processing time?
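For anyone who wants to dig into where requests spends the extra time, a profiling sketch like this (just an illustration, using the same loop as above against the local server) might help narrow it down:

import cProfile
import pstats
import requests

def fetch_many(n=1000):
    # Same requests loop as above, reusing one Session / connection pool
    with requests.Session() as session:
        for _ in range(n):
            session.get("http://localhost:8000/")

# Profile the loop and print the 15 most expensive calls by cumulative time
profiler = cProfile.Profile()
profiler.enable()
fetch_many()
profiler.disable()
pstats.Stats(profiler).sort_stats("cumulative").print_stats(15)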