I am trying to write a Python script that changes the storage class of an AWS S3 object from GLACIER or DEEP_ARCHIVE to STANDARD. The reverse is fairly easy to implement synchronously using the s3_client.copy() method and specifying the new storage class as, say, DEEP_ARCHIVE.
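For reference, here is roughly how I do that reverse direction (the bucket and key names are placeholders):

```python
import boto3

s3_client = boto3.client('s3')

# Copy the object onto itself, changing only the storage class.
s3_client.copy(
    CopySource={'Bucket': 'my-bucket-name', 'Key': 'file.txt'},
    Bucket='my-bucket-name',
    Key='file.txt',
    ExtraArgs={'StorageClass': 'DEEP_ARCHIVE'},
)
```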

I understand that recovering from these "deep" classes requires submitting a restore request, after which the data becomes available some time later. That's perfectly clear.

However, I'm not able to make it work by following the official boto3 restore_object documentation.

Here's my script:

import boto3

session = boto3.Session('...')
s3_resource = session.resource('s3')

obj = s3_resource.Object('my-bucket-name', 'file.txt')

response = obj.restore_object(
    RestoreRequest={
        'Tier': 'Expedited',
        'OutputLocation': {
            'S3': {
                'BucketName': 'my-bucket-name',
                'Prefix': 'file.txt',
                'StorageClass': 'STANDARD'
            }
        }
    }
)

The error I keep getting is:

botocore.exceptions.ClientError: An error occurred (MalformedXML) when calling the RestoreObject operation: The XML you provided was not well-formed or did not validate against our published schema

Interestingly, using the `Days` argument works fine; however, that is not what I want to achieve, because it neither permanently restores the data nor actually switches the key to the STANDARD class.
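For completeness, the `Days`-based call that does work looks roughly like this (values are illustrative; note that DEEP_ARCHIVE only supports the Standard and Bulk retrieval tiers):

```python
response = obj.restore_object(
    RestoreRequest={
        'Days': 7,  # how long S3 keeps the temporary restored copy
        'GlacierJobParameters': {'Tier': 'Standard'},  # DEEP_ARCHIVE supports only Standard/Bulk
    }
)
```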

Having studied the aforementioned documentation, I got the strong impression that what I want to achieve is supported.

This guy's code at line 97 is similar to what I need but restores to a different bucket -- not sure if that's what makes the difference, assuming his code still works.

Sotosoul
    `OutputLocation` is only used to store the output of a S3 Select query. In other words, really want you want to do is use the `days` argument, then _after_ the restore is complete, perform an S3 copy (you can copy to/from the same bucket and key), which will generate a new object in the desired non-glacier storage class. – Anon Coward Jul 18 '23 at 02:24
  • Related: [Unable to restore Glacier deep archive to different S3 bucket - Stack Overflow](https://stackoverflow.com/questions/63878940/unable-to-restore-glacier-deep-archive-to-different-s3-bucket) – John Rotenstein Jul 18 '23 at 02:44
  • I see. Unfortunately, I wasn't able to deduce from the `restore_object` documentation that the only way to achieve what I want is by doing what you wrote, @AnonCoward. I don't want to restore data using the "days" approach; however, it looks like I have to. Thanks for your message :) – Sotosoul Jul 18 '23 at 10:59
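A minimal sketch of the two-step workflow described in the first comment, using the bucket and key from the question (the restore tier and polling interval are illustrative, not prescriptive):

```python
import time

import boto3

s3 = boto3.client('s3')
bucket, key = 'my-bucket-name', 'file.txt'

# 1. Request a temporary restored copy of the archived object.
s3.restore_object(
    Bucket=bucket,
    Key=key,
    RestoreRequest={'Days': 7, 'GlacierJobParameters': {'Tier': 'Standard'}},
)

# 2. Wait until the restore completes: the Restore header flips from
#    'ongoing-request="true"' to 'ongoing-request="false"'.
while True:
    head = s3.head_object(Bucket=bucket, Key=key)
    if 'ongoing-request="false"' in head.get('Restore', ''):
        break
    time.sleep(300)  # DEEP_ARCHIVE restores can take many hours

# 3. Copy the object onto itself with the new storage class; this
#    replaces the archived object with a STANDARD one.
s3.copy(
    CopySource={'Bucket': bucket, 'Key': key},
    Bucket=bucket,
    Key=key,
    ExtraArgs={'StorageClass': 'STANDARD'},
)
```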

0 Answers