I want to POST a large file from a Python client to CherryPy. I'm using the requests library.
This is my client code:
import sys
import requests

def upload(fileName=None):
    url = 'http://localhost:8080/upload'
    files = {'myFile': (fileName, open(fileName, 'rb'))}
    r = requests.post(url, files=files)
    # Also tried sending the raw body instead of multipart:
    # with open(fileName, 'rb') as payload:
    #     headers = {'content-type': 'multipart/form-data'}
    #     r = requests.post('http://127.0.0.1:8080', data=payload, verify=False, headers=headers)

if __name__ == '__main__':
    upload(sys.argv[1])
The problem is that this loads the whole file into RAM. Is there any way to POST the file in pieces?
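For illustration, this is the kind of chunked client I have in mind (just a sketch; the URL and chunk size are taken from my setup above, and I'm not sure CherryPy will accept a raw streamed body like this instead of a multipart form):

import requests

def read_in_chunks(fileName, chunkSize=1024 * 1024):
    # Yield the file piece by piece so requests can stream it
    # (requests uses chunked transfer encoding when given a generator).
    with open(fileName, 'rb') as f:
        while True:
            chunk = f.read(chunkSize)
            if not chunk:
                break
            yield chunk

def upload_streamed(fileName):
    # Note: this sends a raw body, not multipart/form-data, so the server
    # handler would have to read the request body instead of a form field.
    r = requests.post('http://localhost:8080/upload',
                      data=read_in_chunks(fileName))
    return r

I've also seen requests-toolbelt's MultipartEncoder mentioned for streaming multipart bodies, which might be closer to what the files= call does, but I don't know if that's the right direction either.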
import cherrypy

class FileDemo(object):
    @cherrypy.expose
    def upload(self, myFile):
        print(myFile.filename)
        # size = 0
        # decoder = MultipartDecoder(myFile, 'image/jpeg')
        # for part in decoder.parts:
        #     print(part.header['content-type'])
        # while True:
        #     # advances to the content that hasn't been read
        #     myFile.file.seek(size, 0)
        #     # reads ~10 MB at a time so it doesn't fill up the RAM
        #     data = myFile.file.read(10240000)
        #     newFile = open("/home/ivo/Desktop/" + str(myFile.filename), 'a+')
        #     newFile.write(data)
        #     newFile.close()
        #     size += len(data)
        #     if len(data) < 10240000:
        #         break

if __name__ == '__main__':
    cherrypy.quickstart(FileDemo())
This is the code on the server side. It has a lot of comments because I've been trying a lot of things. Right now I'm just printing the file name, and the client still loads the whole file into RAM.
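For reference, this is roughly what I'd like the handler to end up doing, based on the commented-out attempt above (a sketch only; the destination directory is just an example from my machine):

import os
import cherrypy

class FileDemo(object):
    @cherrypy.expose
    def upload(self, myFile):
        # Copy the uploaded part to disk in chunks instead of reading it all at once.
        destination = os.path.join("/home/ivo/Desktop", myFile.filename)
        with open(destination, 'wb') as out:
            while True:
                data = myFile.file.read(8192)  # read ~8 KB per iteration
                if not data:
                    break
                out.write(data)
        return "Saved %s" % myFile.filename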
I don't know what else to try. Thank you in advance for your help.