I’m working on a Python app that needs to send hundreds of HTTP requests in a short period of time to a web service that is SSL-only. Is there any support for HTTP Pipelining in Python?
- Have you already tried sending these requests concurrently and found it's too slow? [Here](http://stackoverflow.com/a/9010299/95735) is a list of questions about concurrent HTTP requests in Python. – Piotr Dobrogost Aug 30 '12 at 20:13
- `requests` looks promising, I’ll try it out. Thanks! – andrewdotn Aug 30 '12 at 23:49
2 Answers
I was able to solve this using Python's grequests module. As documented in issue 13, it's as simple as:
import grequests  # import before requests so gevent can monkey-patch cleanly
import requests

s = requests.Session()  # one shared session so TCP/TLS connections are reused
rs = [grequests.get(url, session=s) for url in urls]  # urls: the list of URLs to fetch
grequests.map(rs)  # send the requests concurrently and collect the responses
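If you need to cap how many requests are in flight at once, `grequests.map` also accepts a `size` argument (at least in the versions I've used), e.g. `grequests.map(rs, size=10)`.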

andrewdotn
- Does this actually do HTTP pipelining as originally requested? Or just separate async requests, each of which incurs connection latency, setup, and overhead? – nealmcb Nov 21 '13 at 21:05
- @nealmcb Things may have changed in the year since I poked at this, but I believe that while requests didn’t do real pipelining in the sense of sending a ton of requests all at once on one HTTP connection, it did reuse HTTP connections, and it was really really fast so that parsing and persisting the responses became the bottleneck. – andrewdotn Nov 21 '13 at 22:13
- Ahh - I was thinking of keep-alive, which is typically on by default. Given requests.session(), this does seem on the right track, though I haven't tried it. – nealmcb Nov 21 '13 at 22:23
- Yes, though it's good you found a satisfactory solution to your own original problem, this doesn't really do much for those of us who are actually looking for real HTTP pipelining. – Mumbleskates Nov 22 '15 at 09:13
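For readers who, as the comments above discuss, only need connection reuse (keep-alive) rather than true pipelining, here is a minimal sketch using plain requests with a standard-library thread pool. The URLs, worker count, and the fetch helper are illustrative placeholders, not part of the original answer; note also that requests does not formally guarantee that a Session is thread-safe, though sharing one for simple GETs is a common pattern.

import concurrent.futures
import requests

urls = ["https://example.com/item/%d" % i for i in range(100)]  # placeholder HTTPS URLs

session = requests.Session()  # keep-alive: TCP/TLS connections are pooled and reused

def fetch(url):
    # Each call borrows a pooled connection from the shared session when one is free
    return session.get(url)

with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    responses = list(pool.map(fetch, urls))

print(sum(1 for r in responses if r.ok), "of", len(responses), "requests succeeded")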