When I request a website with the Python module requests, I don't get an up-to-date webpage but a cached version. As far as I know, requests does no caching of its own, or am I wrong?
import requests

# Fetch the page and print the returned HTML
finanzennet_request = requests.get('http://finanzen.net/aktien/Tesla-Aktie')
print(finanzennet_request.text)
This yields the following result:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<!-- CacheEngine generated: 87039 chars in 0,0313 seconds on 26.08.2015 21:39:07 from NT -->
As you can see, it says "CacheEngine generated...". Can it really be that the webserver recognizes that my script is not a real browser and therefore only serves me a cached version? If so, how can I avoid it?
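For what it's worth, this is roughly what I was thinking of trying next: sending cache-bypassing headers and a browser-like User-Agent so the server doesn't treat my script differently from a real browser. I'm not sure whether this actually helps here, and the specific header values are just my own guesses:

import requests

# Guess: a browser-like User-Agent plus no-cache headers might make the
# server skip its CacheEngine; these values are assumptions on my part
headers = {
    'User-Agent': ('Mozilla/5.0 (Windows NT 6.1; Win64; x64) '
                   'AppleWebKit/537.36 (KHTML, like Gecko) '
                   'Chrome/44.0.2403.157 Safari/537.36'),
    'Cache-Control': 'no-cache',
    'Pragma': 'no-cache',
}

finanzennet_request = requests.get(
    'http://finanzen.net/aktien/Tesla-Aktie',
    headers=headers,
)
print(finanzennet_request.text)

Would something like this be the right approach, or is the caching happening somewhere I can't influence from the client side?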