
I was trying to copy all the files from my S3 bucket to a local folder on a VM, and I am getting the following error:

warning: Skipping file s3://bucket/object. Object is of storage class GLACIER.
Unable to perform download operations on GLACIER objects. You must restore the
object to be able to perform the operation. See aws s3 download help for
additional parameter options to ignore or force these transfers.


To copy files from my S3 bucket to local folder I used the following command:

aws s3 cp s3://${s3Location} ${localDumpPath}

Where:

  • ${s3Location} = my S3 location, and
  • ${localDumpPath} = my local folder path

What do I need to change to be able to copy successfully?

  • Objects stored in a `GLACIER` storage class are not immediately accessible. This is a trade-off for having a lower storage price. Therefore, you need to 'trigger' the Restore process. This can be done with the [restore-object](https://docs.aws.amazon.com/cli/latest/reference/s3api/restore-object.html) command. However, @snehab's suggestion will presumably incorporate the restore into the copy process. (But I wonder how long it would take to execute?) – John Rotenstein Jul 03 '20 at 07:21

3 Answers


I fixed the issue by using the following command:

aws s3 cp s3://${s3Location} ${localDumpPath} --storage-class STANDARD --recursive --force-glacier-transfer

You can also refer to the following link for details on how to restore an S3 object from the Amazon S3 Glacier storage class using the AWS CLI: Restore S3 object from the Amazon Glacier storage class

snehab
  • Getting this error when I try it: `An error occurred (InvalidObjectState) when calling the GetObject operation: The operation is not valid for the object's storage class` :( – CpILL May 13 '21 at 21:47
  • --force-glacier-transfer is the non-intuitive part. Thanks! – N R Jul 24 '23 at 19:30

The problem: you are trying to copy an Amazon S3 object whose storage class is GLACIER, and you get the following error:

warning: Skipping file s3://<SomePathToS3Object> Object is of storage class GLACIER.
Unable to perform download operations on GLACIER objects.
You must restore the object to be able to perform the operation.
See aws s3 download help for additional parameter options to ignore or force these transfers.

Explanation: Amazon S3 Glacier is a secure, durable, and extremely low-cost cloud storage service for data archiving and long-term backup. When you need to use a file, you initiate a restore request, pay a retrieval price, and after a couple of hours (depending on the retrieval tier) the object becomes available. Companies typically use this feature to archive files/logs/databases/backups that are rarely consumed.

Solution: in order to get Glacier files, you need to initiate a restore request, monitor the status of the restore request, and as soon as it completes, copy the object while forcing the STANDARD storage class. You can use the AWS reference:

# Initiate restore request:
$ aws s3api restore-object --bucket examplebucket --key dir1/example.obj \
    --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Standard"}}'

# Monitor status:
$ aws s3api head-object --bucket examplebucket --key dir1/example.obj

# Output example - restore in progress:
{
    "Restore": "ongoing-request=\"true\"",
    ...
    "StorageClass": "GLACIER",
    "Metadata": {}
}

# Output example - restore completed:
{
    "Restore": "ongoing-request=\"false\", expiry-date=\"Sun, 1 January 2000 00:00:00 GMT\"",
    ...
    "StorageClass": "GLACIER",
    "Metadata": {}
}

# Copy after the restore completes:
$ aws s3 cp s3://examplebucket/dir1/ ~/Downloads \
    --storage-class STANDARD --recursive --force-glacier-transfer
avivamg
  • Getting this error when trying `An error occurred (AccessDenied) when calling the RestoreObject operation: Access Denied` – Dan Li Dec 21 '22 at 15:47

You wrote that you needed "to copy all the files" to a local folder, so I assume you want to copy the files recursively.

Because the files are kept in the Glacier storage class, you need to restore them from the Glacier archive before you can copy them to your local folder, i.e. make the files available for retrieval for a specified number of days. After the restore completes, you can copy the files with the --force-glacier-transfer parameter until the period that you specified in days expires.
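As a minimal sketch (bucket and key names are hypothetical), you can poll the object's `Restore` header until it flips from `ongoing-request="true"` to `ongoing-request="false"`, and only then run the recursive copy:

```shell
# Hypothetical bucket/key; requires configured AWS credentials and the AWS CLI.
BUCKET=examplebucket
KEY=dir1/example.obj

# Poll the Restore header until the restore has finished.
until aws s3api head-object --bucket "$BUCKET" --key "$KEY" \
        --query Restore --output text | grep -q 'ongoing-request="false"'; do
  echo "restore still in progress, waiting..."
  sleep 300   # re-check every 5 minutes
done

# Now the recursive copy with --force-glacier-transfer should succeed.
aws s3 cp "s3://$BUCKET/dir1/" ./local-dump --recursive --force-glacier-transfer
```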

Unless you store the files in the "S3 Glacier Instant Retrieval" storage class, you should first restore the files (make them available for retrieval) so that the --force-glacier-transfer option does not fail. Therefore, the solution proposed at https://stackoverflow.com/a/62651252/6910868 does not apply to the "S3 Glacier Deep Archive" storage class, for which you have to explicitly issue the restore-object command and wait for its completion before you can copy the files to your local folder.

However, aws s3api restore-object restores just one object and does not support recursive restores. The solution specified at https://stackoverflow.com/a/65925266/6910868 does not work for a whole directory tree, or when you have multiple files and wish to specify just the folder rather than listing the files one by one.
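One way to work around the single-object limitation, sketched here with hypothetical bucket and prefix names, is to list the Glacier-class keys under a prefix and issue one restore-object request per key:

```shell
# Hypothetical names; requires configured AWS credentials and the AWS CLI.
BUCKET=examplebucket
PREFIX=dir1/

# List keys stored in the GLACIER class under the prefix (tab-separated),
# split them onto separate lines, and restore each one for 7 days.
aws s3api list-objects-v2 --bucket "$BUCKET" --prefix "$PREFIX" \
    --query "Contents[?StorageClass=='GLACIER'].Key" --output text |
  tr '\t' '\n' |
  while IFS= read -r key; do
    aws s3api restore-object --bucket "$BUCKET" --key "$key" \
      --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Standard"}}'
  done
```

Note that list-objects-v2 returns at most 1000 keys per call, so very large prefixes may need pagination.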

As an alternative, instead of restoring the files by making them available for retrieval, you can change the object's storage class to Amazon S3 Standard. To do that, you can copy the files within S3, either by overwriting the existing files or by copying the files from one S3 location into another S3 location. In each case, you should specify the correct destination storage class.
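A sketch of the overwrite-in-place variant (the location is hypothetical): copying the prefix onto itself with a STANDARD destination storage class rewrites each object, so future downloads need no restore step. The objects must already have been restored for the copy to succeed:

```shell
# Hypothetical location; assumes the Glacier objects are already restored.
# Source and destination are identical, so each object is rewritten in place
# with the STANDARD storage class.
aws s3 cp s3://examplebucket/dir1/ s3://examplebucket/dir1/ \
    --recursive --storage-class STANDARD --force-glacier-transfer
```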

If you just need to retrieve recursive files from the Glacier storage class without changing the storage class or making additional copies within S3, you can use the Perl script that lists the files recursively and then restores them from Glacier individually. This script may be used not only to initiate the restore with the specified restore tier, but also to monitor the process.

Maxim Masiutin