When the curl executable writes its output to stdout, is its memory use proportional to the total size of the downloaded content?
If I'm downloading a 1GB tgz with curl
that I pipe to tar
-- as in the example below -- is the memory used proportional to that 1GB?
curl -sSL https://path/to/1GB_file.tar.gz | tar -xvzf - -C /
If the answer is 'yes', is there a way to "throttle" or limit the maximum memory used by curl
without outright causing the download itself to fail?
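The closest things I can think of are limiting bandwidth with --limit-rate, or capping the process's address space with ulimit -v in a subshell, but I don't know whether either actually bounds curl's memory use without breaking the transfer. A rough sketch of the ulimit idea (the 102400 KB limit is just an illustrative guess):

( ulimit -v 102400; curl -sSL https://path/to/1GB_file.tar.gz ) | tar -xvzf - -C /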
Is there a way to measure how much memory curl
uses -- either dynamically, or as a peak value -- while downloading a given file?
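For instance, would wrapping curl in GNU time give a meaningful figure? Something like the following, assuming GNU time is installed at /usr/bin/time, so that its "Maximum resident set size" line reports on curl only and not tar:

/usr/bin/time -v curl -sSL https://path/to/1GB_file.tar.gz | tar -xvzf - -C /

Since the -v report goes to stderr, it presumably wouldn't interfere with the data being piped to tar.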