I am trying to write a Python script that changes the storage class of an AWS S3 object from `GLACIER` or `DEEP_ARCHIVE` to `STANDARD`. The reverse is fairly easy to implement in a synchronous manner using the `s3_client.copy()` method and specifying the new storage class as, say, `DEEP_ARCHIVE`.
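For context, this is roughly what I mean by the "reverse" direction (a sketch only; the bucket and key names are placeholders, and the helper function name is mine):

```python
def change_storage_class(s3_client, bucket, key, storage_class):
    # Copy the object onto itself, asking S3 to store the copy under the
    # new storage class; uses boto3's managed copy() with ExtraArgs.
    s3_client.copy(
        CopySource={"Bucket": bucket, "Key": key},
        Bucket=bucket,
        Key=key,
        ExtraArgs={"StorageClass": storage_class},
    )

# Usage (requires AWS credentials):
# import boto3
# change_storage_class(boto3.client("s3"), "my-bucket-name", "file.txt", "DEEP_ARCHIVE")
```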
I understand that recovering from these "deep" classes requires submitting a restore request, after which the data becomes available some time later. That's perfectly clear.
However, I'm not able to make it work by following the official boto3 `restore_object` documentation.
Here's my script:
```python
import boto3

session = boto3.Session('...')
s3_resource = session.resource('s3')
obj = s3_resource.Object('my-bucket-name', 'file.txt')
response = obj.restore_object(
    RestoreRequest={
        'Tier': 'Expedited',
        'OutputLocation': {
            'S3': {
                'BucketName': 'my-bucket-name',
                'Prefix': 'file.txt',
                'StorageClass': 'STANDARD'
            }
        }
    }
)
```
The error I keep getting is:
```
botocore.exceptions.ClientError: An error occurred (MalformedXML) when calling the RestoreObject operation: The XML you provided was not well-formed or did not validate against our published schema
```
Interestingly, using the `Days` argument works fine; however, that is not what I want to achieve, because it neither permanently restores the data nor actually switches the key to the `STANDARD` class.
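For reference, this is the `Days`-based variant that does succeed (a sketch; the helper name and placeholders are mine). It only makes a temporary copy of the object available, without changing the key's storage class:

```python
def request_temporary_restore(obj, days=7, tier="Standard"):
    # Submit a restore request for a GLACIER/DEEP_ARCHIVE object. S3 keeps a
    # temporary readable copy for `days` days; the key's storage class itself
    # remains unchanged. With `Days`, the tier goes in GlacierJobParameters.
    return obj.restore_object(
        RestoreRequest={
            "Days": days,
            "GlacierJobParameters": {"Tier": tier},
        }
    )

# Usage (requires AWS credentials):
# import boto3
# obj = boto3.resource("s3").Object("my-bucket-name", "file.txt")
# request_temporary_restore(obj, days=2, tier="Expedited")
```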
Having studied the aforementioned documentation, I get the strong impression that what I want to achieve is supported.
This guy's code at line 97 is similar to what I need, but it restores to a different bucket -- not sure if that's what makes the difference, assuming his code still works.