I have a custom multi-threaded HTTP server (in C++) that:
1) accepts POSTs of media TS segments from different video streams,
2) archives a batch of them into a zip file on disk, and
3) uploads the zip file to a preconfigured origin/content server (say, after 100 segments have been archived into the zip).
The problem is that each zip file is around ~100 MB, and with a high rate of client POSTs (150 per second), uploading these zip files blows up the process's vsize/RSS and crashes the server, since the current upload path reads each entire zip file into memory.
Is there a memory-aware/memory-efficient way for the upload threads to achieve maximum throughput without exhausting memory?
Perhaps some kind of dynamic rate limiting, so that too many concurrent uploads don't drive up vsize?
The platform is Linux (Ubuntu 14.04).