I have a problem streaming the download of a large file (about 1.5 GB) with python-requests v2.0.1:
with open("saved.rar",'wb') as file:
r = session.get(url,stream=True,timeout=3600)
for chunk in r.iter_content(chunk_size=1024):
if chunk:
file.write(chunk)
file.flush()
I tested it a few times on my VPS, and sometimes it downloaded 200 MB, 500 MB, or 800 MB and saved the file without any error. It doesn't hit the timeout; it just stops as if the download had finished.
The host I'm downloading the file from is stable, because I have no problem downloading the same file in a browser.
Is there any way to download a large file with python-requests and be 100% sure it's the whole file?
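The best I've come up with so far is a minimal check (assuming the same session and url as above): compare the number of bytes actually written against the Content-Length header the server sends, and raise if they don't match. But that only detects the truncation, it doesn't explain it:

# Sketch: detect an incomplete download by checking Content-Length (session/url as above)
r = session.get(url, stream=True, timeout=3600)
expected = int(r.headers.get('content-length', 0))
written = 0
with open("saved.rar", 'wb') as file:
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            file.write(chunk)
            written += len(chunk)
if expected and written != expected:
    raise IOError("incomplete download: got %d of %d bytes" % (written, expected))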
@Edit
I've solved it using urllib; the problem only occurs with requests. Anyway, thanks for the help.
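For completeness, a minimal sketch of the urllib approach I mean (this assumes Python 3's urllib.request and the same url variable; on Python 2, urllib2.urlopen would be the rough equivalent):

import shutil
import urllib.request

# Stream the response straight to disk, similar to iter_content above
with urllib.request.urlopen(url, timeout=3600) as response, open("saved.rar", 'wb') as file:
    shutil.copyfileobj(response, file)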