
I'm trying to upload a 160 GB file from EC2 to S3 using

s3cmd put --continue-put FILE s3://bucket/FILE

but every time uploading interrupts with the message:

FILE -> s3://bucket/FILE [part 10001 of 10538, 15MB] 8192 of 15728640 0% in 1s 6.01 kB/s  failed

ERROR: Upload of 'FILE' part 10001 failed. Aborting multipart upload.
ERROR: Upload of 'FILE' failed too many times. Skipping that file.

The target bucket does exist.

What is causing this?

Are there any other ways to upload the file?

Thanks.

hdf

2 Answers


S3 allows at most 10,000 parts per multipart upload, so the upload fails as soon as s3cmd starts part 10001. Using larger parts, so that the whole file fits in 10,000 parts or fewer, should solve the issue.
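For example, s3cmd's default part size is 15 MB, which the log above shows would take 10,538 parts for this file. Raising the part size with s3cmd's `--multipart-chunk-size-mb` option keeps the part count under the limit; 64 MB here is just an illustrative value, and anything that yields 10,000 parts or fewer should work:

```shell
# s3cmd's default multipart chunk is 15 MB, which for this file means
# 10,538 parts, over S3's 10,000-part limit, so part 10001 fails.
# A 64 MB chunk brings the count down to roughly 2,500 parts:
s3cmd put --multipart-chunk-size-mb=64 --continue-put FILE s3://bucket/FILE
```

Note that S3 also caps individual parts at 5 GB, so there is plenty of headroom between the 16 MB minimum this file needs and that upper bound.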

Julio Faerman

How huge is the file, tens or hundreds of GBs? S3 limits the object size to 5 GB, and uploading may fail if the file exceeds that limit.

ericzma
  • The file is 160 GB. I had no problems uploading files of up to 80 GB. – hdf Apr 27 '14 at 17:51
  • Ahh, my S3 info is outdated; they support objects of up to 5 TB now: http://aws.typepad.com/aws/2010/12/amazon-s3-object-size-limit.html – ericzma Apr 28 '14 at 06:12