
I am doing some batch processing and occasionally end up with corrupt lines of (string) data. I would like to upload these lines to a file in S3.

Now, I would really want to add all the lines to a single file and upload it in one go after my script finishes executing, but my client asked me to use a socket connection instead and add each line one by one as it comes up, simulating a single slow upload.

It sounds like he's done this before, but I couldn't find any reference to anything like it (I'm not talking about multi-part uploads). Has anyone done something like this before?
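For context, the end-of-run approach I had in mind is just buffering the lines and pushing everything in a single `put_object` call once the batch finishes. A minimal sketch (the bucket and key names are placeholders):

```python
import boto3

s3 = boto3.client("s3")

corrupt_lines = []

# ...during batch processing, the occasional bad line gets appended:
# corrupt_lines.append(line)

# One upload at the very end of the run.
s3.put_object(
    Bucket="my-bucket",                          # placeholder
    Key="batch-errors/corrupt-lines.txt",        # placeholder
    Body="\n".join(corrupt_lines).encode("utf-8"),
)
```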

Zac R.
    It doesn't look possible. Here's a similar question. https://stackoverflow.com/questions/31031463/can-you-upload-to-s3-using-a-stream-rather-than-a-local-file – kenlukas Aug 02 '18 at 16:13
  • You could use a multi-part upload and provide each _line_ as a _part_. Seems a bit wasteful, since it would involve so many calls to S3. – John Rotenstein Aug 03 '18 at 02:24
  • Thanks. I was aware of the multi-part solution. It's quite easy to do with `smart_open` (roughly the sketch below), but it's not what I am looking for. Plus, there is a limit of 10,000 parts, which wouldn't be enough in my case anyway. – Zac R. Aug 04 '18 at 20:10
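For reference, the `smart_open` route mentioned in the comments looks roughly like this; the library handles the S3 multi-part upload behind the scenes as you write. Bucket, key, and the sample lines below are placeholders:

```python
from smart_open import open as s3_open

corrupt_lines = ["bad record 1", "bad record 2"]  # stand-in for the real stream

# smart_open wraps the writes in a multi-part upload to S3 internally.
with s3_open("s3://my-bucket/batch-errors/corrupt-lines.txt", "w") as fout:
    for line in corrupt_lines:
        fout.write(line + "\n")
```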

0 Answers