I am trying to make a GET request with Python's requests library. I do not want to skip the request, so I don't think a timeout would help me. Opening the URL in my browser does not cause any problems, but when I pass the URL to the requests.get()
function, it takes over a minute to process.
import time
import requests

start = time.time()
url = 'desired_url'
requests.get(url)
print(f'it took {time.time() - start} seconds to process the request')
This piece of code gives me:
it took 76.72762107849121 seconds to process the request
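Since the browser is fast while requests is slow, it may help to time the DNS lookup and the HTTP request separately before blaming the library. Below is a minimal diagnostic sketch; it uses a local throwaway server as a stand-in for the real endpoint so it runs offline, and the host/url values are placeholders to swap for the real ones:

```python
import socket
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass


# Local stand-in for the real endpoint so the sketch is runnable offline;
# replace host/url with the real values when diagnosing.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address
url = f"http://{host}:{port}/"

# Time the name resolution separately: a slow resolver (for example an
# IPv6 lookup that must time out before falling back to IPv4) can add
# long delays that the browser hides with its own DNS cache.
start = time.time()
socket.getaddrinfo(host, port)
dns_seconds = time.time() - start

# Time the request itself; response.elapsed covers only the span from
# sending the request until the response headers arrive.
start = time.time()
response = requests.get(url, timeout=(5, 30))  # (connect, read) timeouts
total_seconds = time.time() - start

print(f"DNS: {dns_seconds:.3f}s, total: {total_seconds:.3f}s, "
      f"server latency: {response.elapsed.total_seconds():.3f}s")
server.shutdown()
```

If the DNS figure dominates, the problem is name resolution rather than requests itself; if the total far exceeds response.elapsed, the time is being spent before or after the server round-trip.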
I am using the following version of requests:
requests==2.21.0
Since I would like to send thousands of requests, more than a minute per request is far too long.
Any idea what is happening here? How can I make my requests.get() calls faster?