I am facing very strange behavior on a webpage that I am accessing with the Python requests library. Essentially my code works fine for an hour or two and then, at a given point, stops working.
I assumed the server was down and nobody could access the webpage, but my browser is still able to open it; meanwhile, I am no longer able to access it using requests.
The code is simple:
import requests

url = "https://www.rememori.com"
data = requests.get(url)
This code works fine, but at a given point in time it stops working and produces the following output:
print(data.text)
<html><head>
<meta http-equiv="content-type" content="text/html;charset=utf-8">
<title>502 Server Error</title>
</head>
<body text=#000000 bgcolor=#ffffff>
<h1>Error: Server Error</h1>
<h2>The server encountered a temporary error and could not complete your request.<p>Please try again in 30 seconds.</h2>
<h2></h2>
</body></html>
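For what it is worth, the status code and the answering server can also be inspected directly (a minimal sketch; the comments describe what I would expect to see, not captured output):
print(data.status_code)            # expected to be 502 while the failure lasts
print(data.headers.get("Server"))  # whichever server/proxy actually produced the error page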
However, when accessing the webpage with a browser, or even from Android, everything goes smoothly.
I tried to find a similar error and came across these recommendations on Stack Overflow: "why url works in browser but not using requests get method" and "Python request.get fails to get an answer for a url I can open on my browser".
Essentially they propose adding headers, which I tried, but the result is the same:
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.76 Safari/537.36",
    "Upgrade-Insecure-Requests": "1",
    "DNT": "1",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.5",
    "Accept-Encoding": "gzip, deflate",
}
data = requests.get(url, headers=headers)
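For completeness, retries on 502 can also be wired in through a Session (a sketch only; the retry count and backoff values below are arbitrary), although that would only work around the symptom rather than explain it:
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Retry a few times when the server answers 502, waiting longer between attempts.
retries = Retry(total=3, backoff_factor=2, status_forcelist=[502])
session.mount("https://", HTTPAdapter(max_retries=retries))
data = session.get(url, headers=headers, timeout=10)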
Any idea what might be happening?