Python's `urllib.request` module offers a `urlopen` function which retrieves a URL's content plus some metadata and holds everything in main memory. In environments with limited memory this can quickly result in `MemoryError`s.
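For illustration, a minimal sketch of the pattern I mean (the URL and file name are placeholders): reading the whole response with `.read()` keeps the entire payload in RAM before anything reaches disk.

```python
from urllib.request import urlopen

url = "https://example.com/large-file.bin"  # hypothetical URL

# urlopen returns a file-like response object; calling .read() with no
# size argument pulls the complete body into memory at once, which is
# where the MemoryError can occur for large downloads.
with urlopen(url) as response:
    data = response.read()          # entire body held in RAM
    with open("large-file.bin", "wb") as f:
        f.write(data)
```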
There is another function called `urlretrieve` which seems to do what I am looking for. For some reason, though, the official documentation mentions that it might become deprecated in the future.
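Again as a sketch with a placeholder URL and file name, this is the call I am referring to; it writes the response to disk and returns the local path together with the response headers, but the docs list it under the "legacy interface" that may be deprecated at some point:

```python
from urllib.request import urlretrieve

url = "https://example.com/large-file.bin"  # hypothetical URL

# Streams the download to the given path instead of buffering it all
# in memory; returns (local_filename, headers).
filename, headers = urlretrieve(url, "large-file.bin")
```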
Is there an "official", built-in and non-legacy way for performing a download directly to the local file system? I am aware that this can easily be achieved with third-party libraries such as requests
but I am working under strict computational & memory constraints and therefore would favor a built-in solution.
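For completeness, this is roughly what I mean by "easily achieved" with `requests` (URL, file name, and chunk size are placeholder choices):

```python
import requests

url = "https://example.com/large-file.bin"  # hypothetical URL

# stream=True defers downloading the body; iter_content yields it in
# fixed-size chunks so only one chunk is in memory at a time.
with requests.get(url, stream=True) as response:
    response.raise_for_status()
    with open("large-file.bin", "wb") as f:
        for chunk in response.iter_content(chunk_size=64 * 1024):
            f.write(chunk)
```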