
How should errors be handled and tested for the Python AWS boto3 S3 put_object call? For example:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')
bucket.put_object(Key='bar', Body='foobar')

Are the errors that can arise documented somewhere? Is the following even the right documentation page (it seems to be for the boto3.client('s3') client, not boto3.resource('s3')), and if so, where are the errors documented?

http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.put_object
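
If I understand correctly, errors surface as botocore.exceptions.ClientError, so the kind of handling I have in mind is something like this (a sketch reusing the placeholder names above):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')

try:
    bucket.put_object(Key='bar', Body='foobar')
except ClientError as e:
    # e.response['Error']['Code'] holds the service error code,
    # e.g. 'NoSuchBucket' or 'AccessDenied'
    print(e.response['Error']['Code'])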

Simple errors like a non-existent bucket seem easy enough to test, but can spurious errors occur, and if so, how can that kind of error handling be tested? Are there limits on the upload rate? I tried the following and was surprised to see all 10,000 files successfully created after about 2 minutes of running. Does S3 block, as opposed to erroring, when some rate is exceeded?

from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')

def put(i):
    bucket.put_object(Key='bar/%d' % i, Body='foobar')

executor = ThreadPoolExecutor(max_workers=1024)

# Note: submit() returns a Future; an exception raised inside put()
# is stored on the Future and only surfaces when .result() is called.
futures = [executor.submit(put, i) for i in range(10000)]
for future in futures:
    future.result()  # re-raises any error from put_object

Is it good practice to retry the put_object call one or more times if some error occurs?
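
For example, instead of retrying by hand, I could lean on botocore's built-in retries (a sketch, assuming retries={'max_attempts': ...} on botocore.config.Config is the right knob):

import boto3
from botocore.config import Config

# Raise the SDK's built-in retry count from the default
s3 = boto3.resource('s3', config=Config(retries={'max_attempts': 10}))
s3.Bucket('foo').put_object(Key='bar', Body='foobar')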

JDiMatteo
  • Why not use `try`/`except` in such a case? – mootmoot Aug 24 '16 at 08:21
  • Answers like the following suggest that there is some automatic retry going on: http://stackoverflow.com/questions/29378763/how-to-save-s3-object-to-a-file-using-boto3/35367531#35367531 . Also, some boto3 errors don't raise exceptions; e.g., the Lambda invoke response status code contains the error info: http://boto3.readthedocs.io/en/latest/reference/services/lambda.html#Lambda.Client.invoke – JDiMatteo Aug 24 '16 at 08:27
  • Depends on upload frequency. You may want to verify the upload with a `GET` before retrying the `PUT` request, since AWS charges 10 times more for `PUT` ($0.01/1k requests) than for `GET` ($0.01/10k requests). – mootmoot Aug 24 '16 at 09:00
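
(Following up on that comment: a minimal sketch of verifying before retrying, assuming a HEAD request via head_object is billed at the cheaper GET rate:)

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')

def already_uploaded(bucket, key):
    # HEAD avoids a full download and the higher PUT request charge
    try:
        s3.head_object(Bucket=bucket, Key=key)
        return True
    except ClientError:
        return False

if not already_uploaded('foo', 'bar'):
    s3.put_object(Bucket='foo', Key='bar', Body=b'foobar')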

1 Answer


Amazon S3 does not restrict uploads based on request rate; the restrictions are on size. For example, a single PUT or POST request can upload an object of up to 5 GB, while the S3 console supports uploads of up to 160 GB (and multipart upload goes up to 5 TB).

The errors you are expecting to handle are mostly client-side (or browser) restrictions that appear when uploading many files at a time.

Boto3's upload interface does have a 'Config' parameter, in which you can specify the number of concurrent uploads:

from boto3.s3.transfer import TransferConfig

# To consume less bandwidth, decrease the maximum concurrency
config = TransferConfig(max_concurrency=5)
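
For example (a sketch; the bucket, key, and file names are placeholders):

import boto3
from boto3.s3.transfer import TransferConfig

# Cap the number of concurrent upload threads
config = TransferConfig(max_concurrency=5)

s3 = boto3.resource('s3')
s3.Bucket('foo').upload_file('local-file.txt', 'bar', Config=config)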

Varun Singh