NOTE: This is not a duplicate. Rate-limiting uploads and downloads are entirely different problems, and many libraries I've seen can rate-limit downloads but not uploads. This should be reopened.
I'm running a large file upload (2.8 GB) in Python. The code I'm using looks roughly like this:
import os
import requests_toolbelt

# md5hash, now, fileobject, url, params, headers, and the session `s`
# are all set up earlier in my code.
files = {
    'md5': ('', md5hash),
    'modified': ('', now),
    'created': ('', now),
    'file': (os.path.basename(url), fileobject, 'application/octet-stream'),
}
m = requests_toolbelt.MultipartEncoder(fields=files)
headers['content-type'] = m.content_type
r = s.post(url, data=m, params=params, headers=headers)
When I run this code, everything else on my network grinds to a halt: web pages stop loading, and so on. My guess is that the upload is saturating my uplink and filling my router's send buffers, so everything else's packets sit in a queue behind it. This behavior will be familiar to anyone who has run a BitTorrent client without setting an upload speed limit.
Is there any way I can throttle the upload speed so my users won't have their network destroyed?