
I'm trying to import an existing S3 bucket into a newly created CloudFormation stack. As a reference, I'm using this site. I use a GitHub workflow runner to execute this, like so:

      - name: Add existing S3 bucket and object to Stack
        run:  aws cloudformation create-change-set
          --stack-name ${{ env.STACK_NAME }} --change-set-name ImportChangeSet
          --change-set-type IMPORT
          --resources-to-import file://ResourcesToImport.txt
          --template-url https://cf-templates.s3.eu-central-1.amazonaws.com/ResourcesToImport.yaml

I'm a little confused as to what exactly ResourcesToImport.txt and ResourcesToImport.yaml should contain. I currently have:

ResourcesToImport.txt

    [
      {
          "ResourceType":"AWS::S3::Bucket",
          "LogicalResourceId":"myBucket",
          "ResourceIdentifier": {
            "resourceName":"myBucket",
            "resourceType":"AWS::S3::Bucket"
          }
      }
    ]

NB: I have just used the bucket name here, but actually I want a specific folder within that bucket.

ResourcesToImport.yaml

    AWSTemplateFormatVersion: '2010-09-09'
    Description: Import existing resources
    
    Resources:
      S3SourceBucket:
        Type: AWS::S3::Bucket
        DeletionPolicy: Retain
        BucketName: myBucket

I'm quite sure that replicating the information in both of these files is redundant and incorrect. The ResourcesToImport.yaml file is uploaded in advance to the bucket at cf-templates/ResourcesToImport.yaml.

What should these two files actually contain, if I am to import only an existing S3 bucket and folder?

EDIT

In addition to the template route, I also tried adding the S3 bucket via the console. However, when the S3 URL is added (s3://myBucket/folder1/folder2/), I get:

S3 error: Domain name specified in myBucket is not a valid S3 domain

1 Answer


Here's what the two file inputs to create-change-set should contain when importing:

--resources-to-import: The resources to import into your stack. This file identifies the to-be-imported resources; it is not a template. Make sure each LogicalResourceId matches the logical ID of the corresponding resource in the template below. In your case: "LogicalResourceId": "S3SourceBucket".
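
As a minimal sketch (bucket name and logical ID taken from your question; for an AWS::S3::Bucket the ResourceIdentifier key is BucketName), ResourcesToImport.txt could look like:

    [
      {
        "ResourceType": "AWS::S3::Bucket",
        "LogicalResourceId": "S3SourceBucket",
        "ResourceIdentifier": {
          "BucketName": "myBucket"
        }
      }
    ]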

--template-url: The [S3] location of the file that contains the revised template. This is a CloudFormation template that includes (a) the to-be-imported resources AND (b) the existing stack resources. It is what CloudFormation will deploy when you execute the change set. Note: alternatively, you can pass a local template file with --template-body instead.
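
As a sketch, assuming the stack has no other resources yet, the revised template could be your ResourcesToImport.yaml with BucketName moved under Properties (each imported resource must also carry a DeletionPolicy):

    AWSTemplateFormatVersion: '2010-09-09'
    Description: Import existing resources

    Resources:
      S3SourceBucket:
        Type: AWS::S3::Bucket
        DeletionPolicy: Retain
        Properties:
          BucketName: myBucket

Creating the change set does not apply it; you still have to execute it, e.g. aws cloudformation execute-change-set --stack-name ${{ env.STACK_NAME }} --change-set-name ImportChangeSet.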

Regarding your EDIT:

Bucket names cannot contain slashes; object keys can. S3 does not have folders per se, although object keys containing a / have some folder-like properties. The full path/to/my.json together is the S3 object key name:

Amazon S3 supports buckets and objects, and there is no hierarchy. However, by using prefixes and delimiters in an object key name, the Amazon S3 console and the AWS SDKs can infer hierarchy and introduce the concept of folders
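
For example (names taken from your question), the "folder" is just a key prefix you can filter on:

    # lists the objects whose keys start with the prefix folder1/folder2/
    aws s3api list-objects-v2 --bucket myBucket --prefix folder1/folder2/

So there is no separate resource to import for the folder: CloudFormation has no resource type for S3 objects or prefixes, and importing the bucket itself (AWS::S3::Bucket) is the only option here.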
