
Possible Duplicate:
Python: HTTP Post a large file with streaming

I'm writing a program that uploads large amounts of data over http (specifically to Amazon Glacier but that's irrelevant), and I'm looking at ways to reduce the memory overhead.

The current situation is basically: read a part of the file into memory, then upload that part to the server.

The problem is that a part can be large, up to 4096 MB, and storing all of it in RAM is simply a waste of memory. I'm looking for a way to cut the memory use down to no more than 1 MB.

I have been looking at

HTTPConnection.request(method, url[, body[, headers]])

where body may be an open file (no need to copy it into memory; it's read straight from disk). The problem is that I do NOT want to send the complete file in one go, but rather arbitrary parts of it. Short of creating a new file containing just that part of the data, I don't know how to handle this.
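One way to sketch this (an illustrative helper, not a standard API: the class name and signature are my own) is a small file-like wrapper that exposes only a byte range of the underlying file. `HTTPConnection.request` only needs the body to have a `read` method, so such a wrapper can be passed as `body` together with a matching `Content-Length` header, and at most one buffer's worth of data is in memory at a time:

```python
class FilePartReader:
    """File-like view of the byte range [offset, offset + length)
    of a file on disk. Hypothetical helper; suitable as the `body`
    argument of HTTPConnection.request, which reads it in blocks."""

    def __init__(self, path, offset, length):
        self._f = open(path, "rb")
        self._f.seek(offset)          # jump to the start of the part
        self._remaining = length      # bytes of the part still unread

    def read(self, size=-1):
        # Never hand out bytes beyond the end of the part.
        if self._remaining <= 0:
            return b""
        if size < 0 or size > self._remaining:
            size = self._remaining
        data = self._f.read(size)
        self._remaining -= len(data)
        return data

    def close(self):
        self._f.close()
```

A caller would then do something like `conn.request("PUT", url, body=FilePartReader(path, offset, n), headers={"Content-Length": str(n)})`, assuming the server accepts a plain fixed-length body for each part.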

Wouter
  • Have you looked into some 3rd party application like tornado? – karthikr Oct 08 '12 at 14:48
  • it is possible that HTTPConnection.send may be called several times, for sending several parts of the same file (http://stackoverflow.com/questions/5093622/multiple-send-on-httplib-httpconnection-and-multiple-read-on-httpresponse) – njzk2 Oct 08 '12 at 15:24
  • I hope to use built-in stuff from Python. – Wouter Oct 08 '12 at 18:14
  • Related: [WSGI file streaming with a generator](http://stackoverflow.com/questions/11811404/) – Piotr Dobrogost Oct 10 '12 at 22:20

1 Answer


According to http://bugs.python.org/issue12319, urllib2 supports chunked encoding for POST. I guess add_data can therefore be called more than once.
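As a sketch of the chunked approach (my own illustrative helper name, not part of the standard library), a generator can feed the file part to the connection in small pieces, so that only one chunk is ever in memory; on newer Pythons (3.6+), http.client will, as far as I know, send an iterable body with `Transfer-Encoding: chunked` when no `Content-Length` header is given:

```python
def iter_file_part(path, offset, length, chunk_size=1024 * 1024):
    """Yield the byte range [offset, offset + length) of the file
    in pieces of at most `chunk_size` bytes (default ~1 MB), so the
    whole part never has to be held in memory at once."""
    with open(path, "rb") as f:
        f.seek(offset)                 # start of the part
        remaining = length             # bytes of the part still to send
        while remaining > 0:
            data = f.read(min(chunk_size, remaining))
            if not data:               # file shorter than expected
                break
            yield data
            remaining -= len(data)
```

A call like `conn.request("POST", url, body=iter_file_part(path, offset, n))` would then stream the part chunk by chunk, assuming the server side accepts chunked uploads.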

njzk2