I'm implementing a simple upload handler in Python which reads an uploaded file in chunks into memory, gzips and signs each chunk, and re-uploads them to another server for long-term storage. I've already devised a way to read the upload in chunks with my web server, and essentially I have a workflow like this:
class MyUploadHandler:
    def on_file_started(self, file_name):
        pass

    def on_file_chunk(self, chunk):
        pass

    def on_file_finished(self, file_size):
        pass
This part works great.
Now I need to upload the file in chunks to the final destination after applying my modifications to each chunk. I'm looking for a workflow somewhat like this:
import requests

class MyUploadHandler:
    def on_file_started(self, file_name):
        self.request = requests.put(
            "http://secondaryuploadlocation.com/upload/%s" % (file_name,),
            streaming_upload=True)

    def on_file_chunk(self, chunk):
        self.request.write_body(transform_chunk(chunk))

    def on_file_finished(self, file_size):
        self.request.finish()
Is there a way to do this using the Python requests library? The documentation mentions that you can pass a file-like object (or a generator) to be read as the request body, but I'm not sure exactly what that means or how to apply it to my situation, where chunks are pushed to me by callbacks rather than pulled by the uploader. How can I perform a streaming upload like this?
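For what it's worth, the closest I've been able to sketch is bridging my push-style callbacks to the pull-style iterable that requests seems to accept for `data=` (which, as I understand it, triggers chunked transfer encoding). The `ChunkBridge` name and structure are my own invention, and I'm not sure this is the intended approach:

```python
import queue

_SENTINEL = object()  # marks end of upload

class ChunkBridge:
    """Queue-backed adapter: callbacks push chunks in, an
    iterator pulls them out for the request body."""

    def __init__(self):
        self.chunks = queue.Queue()

    def write(self, chunk):
        # would be called from on_file_chunk
        self.chunks.put(chunk)

    def finish(self):
        # would be called from on_file_finished
        self.chunks.put(_SENTINEL)

    def __iter__(self):
        # requests would pull chunks from this iterator
        while True:
            chunk = self.chunks.get()
            if chunk is _SENTINEL:
                return
            yield chunk

# In on_file_started I imagine starting the PUT on a separate thread,
# something like (untested):
#   bridge = ChunkBridge()
#   threading.Thread(
#       target=requests.put,
#       args=("http://secondaryuploadlocation.com/upload/%s" % (file_name,),),
#       kwargs={"data": bridge},
#   ).start()

# Quick local check of the bridge itself, no network involved:
bridge = ChunkBridge()
for piece in (b"first ", b"second ", b"third"):
    bridge.write(piece)
bridge.finish()
print(b"".join(bridge))  # b'first second third'
```

My worry with this shape is that the blocking `Queue.get` ties up a thread per upload, and I don't know how requests handles errors raised mid-body, so I'd appreciate hearing whether the library has a more direct mechanism.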