I have a link to a file, for example https://example.com/video.mp4, and I'm trying to download video.mp4.
The code I currently have (from here):
```python
import requests

with requests.get(URL, cookies=req_cookies, stream=True) as r:
    r.raise_for_status()
    with open(local_filename, 'wb') as f:
        for chunk in r.iter_content(chunk_size=8192):
            f.write(chunk)
```
After debugging, I found that the program blocks inside requests.get() while it downloads the whole file into RAM; only afterwards does it copy the data from RAM to a file on the SSD.
I want to stream the requests.get() response directly to a file, instead of downloading it to RAM and then copying it to a file, for the following reasons:
- I download large files (>1 GB)
- I don't want Python to use that much memory
- I'm using threads to download multiple files in parallel, and the RAM may fill up
I don't mind using a different library that can do this; all I need is support for sending cookies.
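For illustration, something roughly like this is what I'm after: a sketch (not tested against my server) that copies the raw response stream straight to disk with shutil.copyfileobj, so only one chunk is held in memory at a time. The function name and parameters here are just placeholders I made up:

```python
import shutil
import requests

def download_file(url, local_filename, cookies=None):
    """Stream a response body straight to a file without buffering it all in RAM."""
    # stream=True defers reading the body; the data is then pulled from the
    # socket in chunks as it is consumed, rather than all at once.
    with requests.get(url, cookies=cookies, stream=True) as r:
        r.raise_for_status()
        with open(local_filename, 'wb') as f:
            # copyfileobj reads fixed-size chunks from the raw urllib3 stream
            # and writes each one to the file immediately.
            shutil.copyfileobj(r.raw, f)
    return local_filename
```

One caveat I'm aware of: r.raw does not apply Content-Encoding decoding (e.g. gzip) unless r.raw.decode_content is set to True, whereas iter_content handles that automatically.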