
I create POST requests with requests as follows, with a specified timeout threshold:

response = requests.post(url, data=post_fields, timeout=timeout)

However, to determine a "good" threshold, I need to benchmark the server response times.

How do I compute the minimum and maximum response times for the server?

Shuzheng
  • check out `contextmanager`; you can wrap it around a simple function that takes the elapsed time, instead of having to write a whole new decorator – gold_cy Apr 06 '17 at 11:04

2 Answers


The `Response` object returned by `requests.post()` (and `requests.get()`, etc.) has a property called `elapsed`, which provides the time delta between when the request was sent and when the response was received. To get the delta in seconds, use the `total_seconds()` method:

response = requests.post(url, data=post_fields, timeout=timeout)
print(response.elapsed.total_seconds())

Note that `requests.post()` is a synchronous operation: it blocks until the response is received.
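Since the question asks for minimum and maximum response times, a minimal sketch that repeats the request and aggregates `elapsed` might look like this (the URL, payload, and request count in the usage comment are placeholders; the injectable `post` parameter is my own addition so the network call can be swapped out):

```python
import requests

def benchmark_post(url, data, n=10, timeout=5, post=requests.post):
    """Send n POST requests and return (min, max) elapsed time in seconds.

    `post` is injectable so the timing source can be replaced in tests.
    """
    times = []
    for _ in range(n):
        response = post(url, data=data, timeout=timeout)
        times.append(response.elapsed.total_seconds())
    return min(times), max(times)

# Example (hypothetical URL and payload):
# lo, hi = benchmark_post("https://example.com/api", {"key": "value"}, n=20)
# print("min: {0:.3f}s  max: {1:.3f}s".format(lo, hi))
```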

Nicolas Lykke Iversen
    Just want to add `elapsed` is available on any `Response`, not just responses from `POST` requests. – gdvalderrama Feb 20 '19 at 17:02
    Notice that this won't get you the time it takes to download the response from the server, but only the time it takes until you get the return headers without the response contents. If you want the elapsed time to include the time it takes to download the response you'll have to use time.clock() – DanyAlejandro May 21 '19 at 20:24
  • However, it disappoints that `elapsed` is not available when an error occurs, including ordinary read timeouts, when the ability to measure execution time would be a reasonable expectation. – mirekphd Dec 12 '21 at 13:35

It depends on whether you can hit the server with a lot of test requests, or whether you need to wait for real requests to occur.

If you need real request data, then you'd need to wrap the call to determine the time of each request:

import time

start = time.perf_counter()
response = requests.post(url, data=post_fields, timeout=timeout)
request_time = time.perf_counter() - start  # elapsed time in seconds
self.logger.info("Request completed in {0:.0f}ms".format(request_time * 1000))
# Store request_time in a persistent data store

You'd need somewhere to store the results of each request over a period of time (file, database, etc.). Then you can compute statistics (minimum, maximum, mean, and so on) over the stored response times.
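For instance, if each `request_time` is appended to a plain-text log, one value per line (the filename here is an assumption), the stats can be computed afterwards with nothing but the standard library:

```python
import statistics

def summarize_times(path="request_times.log"):
    """Read elapsed times (seconds, one per line) and return summary stats."""
    with open(path) as f:
        times = [float(line) for line in f if line.strip()]
    return {
        "min": min(times),
        "max": max(times),
        "mean": statistics.mean(times),
        "median": statistics.median(times),
    }
```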

If you have a test server available, you could benchmark the responses without Python, using something like ApacheBench (`ab`) and sending test data with each request:

https://gist.github.com/kelvinn/6a1c51b8976acf25bd78

Ashish Gupta
Daniel Scott
    The `time.clock()` function has been removed in Python 3.8. The preferred replacement appears to be `time.perf_counter()`. – Claus Conrad Mar 14 '22 at 08:49