How should errors be handled/tested for Python AWS boto3 S3 put_object? For example:
import boto3
s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')
bucket.put_object(Key='bar', Body='foobar')
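The only handling I've come up with so far is to catch botocore.exceptions.ClientError and inspect the error code, roughly like the sketch below (I'm assuming the resource interface raises the same ClientError as the client interface, and 'NoSuchBucket' is just an example of a code I'd expect for a missing bucket):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')

try:
    bucket.put_object(Key='bar', Body='foobar')
except ClientError as e:
    # The service's error code ('NoSuchBucket', 'AccessDenied', ...) is in the response dict
    code = e.response['Error']['Code']
    if code == 'NoSuchBucket':
        raise  # handle or re-raise the missing-bucket case
    else:
        raise  # no idea what other codes to expect here, hence the question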
Are the errors that can arise documented somewhere? Is the following even the right documentation page (it seems to be for the boto3.client('s3') client, not boto3.resource('s3')), and if so, where are the errors documented?
http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.put_object
Simple errors like a non-existent bucket seem easy enough to test, but can spurious errors occur, and if so, how can that kind of error handling be tested? Are there limits to the upload rate? I tried the following and was surprised to see all 10000 files successfully created after about 2 minutes of running. Does S3 block, rather than error, when some rate is exceeded?
from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')

def put(i):
    bucket.put_object(Key='bar/%d' % i, Body='foobar')

# Fire off 10000 uploads concurrently and see what happens
executor = ThreadPoolExecutor(max_workers=1024)
for i in range(10000):
    executor.submit(put, i)
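In case it matters for the test, I realize executor.submit silently swallows exceptions unless the futures are checked, so to actually surface errors I'd rewrite the experiment roughly like this (same placeholder bucket, counting failures instead of crashing a worker thread):

from concurrent.futures import ThreadPoolExecutor, as_completed

import boto3
from botocore.exceptions import ClientError

s3 = boto3.resource('s3')
bucket = s3.Bucket('foo')

def put(i):
    bucket.put_object(Key='bar/%d' % i, Body='foobar')

with ThreadPoolExecutor(max_workers=1024) as executor:
    futures = {executor.submit(put, i): i for i in range(10000)}
    failures = 0
    for future in as_completed(futures):
        try:
            future.result()  # re-raises anything put() raised in its worker thread
        except ClientError as e:
            failures += 1
            print('put %d failed: %s' % (futures[future], e.response['Error']['Code']))
    print('%d of %d uploads failed' % (failures, len(futures)))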
Is it good practice to retry the put_object call one or more times if some error occurs?
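I know botocore already retries some errors on its own by default; if the answer is that more retries are good practice, I'd guess the cleaner route is to raise the built-in retry count rather than wrapping put_object in my own loop, something like the following (the max_attempts value is arbitrary):

import boto3
from botocore.config import Config

# Ask botocore to retry throttled/transient failures more times than the default
s3 = boto3.resource('s3', config=Config(retries={'max_attempts': 10}))
bucket = s3.Bucket('foo')
bucket.put_object(Key='bar', Body='foobar')

The alternative would be a manual loop with exponential backoff around the call, but I don't know which of the two is considered good practice, which is really what I'm asking.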