I keep reading stuff like this
gzip compression of chunked encoding response?
Is there a way to work around the limitation of needing ALL of the content in memory at once (which significantly impacts my RAM usage)? I am wondering whether one of the streaming compression formats could be used.
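For context, something like the following sketch is the kind of streaming compression I have in mind. It is just an illustration in Python using `zlib.compressobj` (with `wbits=31` to get the gzip container), not tied to any particular server framework: each chunk is compressed and emitted as it arrives, so the full payload is never buffered.

```python
import zlib

def gzip_stream(chunks):
    """Compress an iterable of byte chunks to gzip incrementally.

    Only one chunk is held in memory at a time; wbits=31 tells
    zlib to emit the gzip header and trailer around the deflate
    stream.
    """
    comp = zlib.compressobj(wbits=31)
    for chunk in chunks:
        data = comp.compress(chunk)
        if data:  # compressor may buffer small inputs internally
            yield data
    yield comp.flush()  # emit any buffered data plus the gzip trailer

# Hypothetical producer: 1 MiB of data arriving in 64 KiB pieces,
# standing in for chunks read from a file or upload stream.
def pieces():
    for _ in range(16):
        yield b"x" * 65536
```

Each value yielded by `gzip_stream` could then be written out as one chunk of a chunked-encoding response.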
I really wonder whether browsers, when uploading or downloading a very large file, actually hold the whole thing in memory (I hope they don't). At the same time, I get the feeling that for large files this just doesn't work well yet.
thanks, Dean