
I am trying to set up a brand new pipeline with the latest version of the AWS CDK for TypeScript (1.128).

The creation of the pipeline is pretty straightforward. I have added source and build stages with no issues. The objective here is automatic deployment of a static landing page.

So far I have this piece of code:

        const landingPageStep = new ShellStep(`${PREFIX}LandingPageCodeBuildStep`, {
            input: CodePipelineSource.connection(`${GIT_ORG}/vicinialandingpage`, GIT_MAIN, {
                connectionArn: GIT_CONNECTION_ARN, // Created using the AWS console
            }),
            installCommands: [
                'npm ci',
            ],
            commands: [
                'npm run build',
            ],
            primaryOutputDirectory: 'out',
        })

        const pipeline = new CodePipeline(this, `${PREFIX}Pipeline`, {
            pipelineName: `${PREFIX}Pipeline`,
            synth: new ShellStep(`${PREFIX}Synth`, {
                input: CodePipelineSource.connection(`${GIT_ORG}/viciniacdk`, GIT_MAIN, {
                    connectionArn: GIT_CONNECTION_ARN, // Created using the AWS console
                }),
                commands: [
                    'npm ci',
                    'npm run build',
                    'npx cdk synth',
                ],
                additionalInputs: {
                    'landing_page': landingPageStep,
                },
            }),
        });

The step I am not sure how to achieve is deploying to S3 using the output of "landing_page". With previous versions of Pipelines there was heavy use of Artifact objects and CodePipelineActions, something similar to this, where sourceOutput is an Artifact object:

    const targetBucket = new s3.Bucket(this, 'MyBucket', {});

    const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
    const deployAction = new codepipeline_actions.S3DeployAction({
        actionName: 'S3Deploy',
        bucket: targetBucket,
        input: sourceOutput,
    });
    const deployStage = pipeline.addStage({
        stageName: 'Deploy',
        actions: [deployAction],
    });

Now it is completely different, since you have access to FileSet objects, and apparently the build steps are intended to be used by nesting outputs as in the example above. Every output file is saved in a bucket with ugly file names, so it is not intended to be accessed directly either.

I have seen some hacky approaches replacing ShellStep with CodeBuildStep and using, as a post-build command in the buildspec.yml file, something like this:

aws s3 sync out s3://cicd-codebuild-static-website/

But that is resolved in the build stage, not in a deployment stage, where it would ideally live.
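For reference, that hacky approach replaces the question's ShellStep with a CodeBuildStep along these lines (a sketch reusing the question's constants; the `rolePolicyStatements` grant is an assumption about what the CodeBuild role would need for the sync to succeed):

```typescript
import * as iam from '@aws-cdk/aws-iam';
import { CodeBuildStep, CodePipelineSource } from '@aws-cdk/pipelines';

// Sketch only: PREFIX, GIT_ORG, GIT_MAIN and GIT_CONNECTION_ARN are the
// question's constants; the bucket name comes from the sync command above.
const landingPageStep = new CodeBuildStep(`${PREFIX}LandingPageCodeBuildStep`, {
    input: CodePipelineSource.connection(`${GIT_ORG}/vicinialandingpage`, GIT_MAIN, {
        connectionArn: GIT_CONNECTION_ARN,
    }),
    installCommands: ['npm ci'],
    commands: [
        'npm run build',
        // The deployment happens here, inside the build stage -- the
        // drawback described above.
        'aws s3 sync out s3://cicd-codebuild-static-website/',
    ],
    primaryOutputDirectory: 'out',
    // Assumed: the CodeBuild role needs write access to the target bucket.
    rolePolicyStatements: [
        new iam.PolicyStatement({
            actions: ['s3:PutObject', 's3:DeleteObject', 's3:ListBucket', 's3:GetBucketLocation'],
            resources: [
                'arn:aws:s3:::cicd-codebuild-static-website',
                'arn:aws:s3:::cicd-codebuild-static-website/*',
            ],
        }),
    ],
});
```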

I have not seen anything insightful in the documentation so any suggestion is welcome. Thanks!

maafk
Abend
  • 'Pipelines' (as opposed to 'CodePipelines') is still experimental and may not have all the available options or be finished with them - have you looked through the documentation and seen anything referring to s3_deployment for static landing pages? – lynkfox Oct 21 '21 at 16:50
  • @lynkfox Yeah, I have read the docs several times and there is no reference (or at least I haven't found it) to S3Deployment using this kind of pipeline. Currently the documentation is a bit messy because the Pipeline, CdkPipeline and CodePipeline classes coexist within the same module. – Abend Oct 21 '21 at 17:30
  • Pipelines (that is to say this one https://docs.aws.amazon.com/cdk/api/latest/docs/pipelines-readme.html `@aws-cdk/pipelines`) is a completely different module than CodePipelines (`@aws-cdk/codepipelines`) the second of which contains CodePipeline and is linked to CodePipelineActions. There is no module called 'CDKpipelines' which one of the above are you trying to use? You said the "new" module so I assumed Pipelines as that is the new cdk pipeline module (with self mutating pipelines and quick synth stages) but perhaps I was mistaken – lynkfox Oct 21 '21 at 22:00
  • @lynkfox It is confusing, since AWS made a mess of this. The `@aws-cdk/codepipelines` module defines the `Pipeline` class, which is a low-level implementation. The `@aws-cdk/pipelines` module defines two high-level classes: `CdkPipeline` (the old API) and `CodePipeline` (the new API). I am trying to use the `CodePipeline` class from the `@aws-cdk/pipelines` module. – Abend Oct 22 '21 at 00:07
  • @lynkfox CDK pipelines is GA, it is not experimental anymore. – gshpychka Oct 25 '21 at 11:20
  • oh really? wow - recent? lol last I checked, a few weeks (months?) ago it wasn't yet, but the cdk is always being improved so yay – lynkfox Oct 25 '21 at 12:57
  • @lynkfox End of July: https://aws.amazon.com/about-aws/whats-new/2021/07/announcing-cdk-pipelines-ga-ci-cd-cdk-apps/ – gshpychka Oct 27 '21 at 10:40
  • my how the time flies. Thanks! I've updated a couple of my answers on different questions because of this new input. – lynkfox Oct 27 '21 at 14:13
  • I'm trying to get this to work too. I can't use the `BucketDeployment` approach suggested below, since that requires a `Source` (which can only come from local assets or from another bucket), and what I'm trying to upload must be the output of a CodeBuild step (specifically, the result of running [hugo](https://gohugo.io/) on _another_ repo's content). So I need a way to upload a FileSet to S3. `S3DeployAction` won't work, since the `input` requires an `Artifact`, not a `FileSet`. – scubbo Nov 11 '21 at 07:17
  • @scubbo you can use `options.artifacts.toCodePipeline(this.input)`, where `this.input` is a `FileSet`. Basically this transforms a `FileSet` into an `Artifact`. – Abend Nov 11 '21 at 13:55
  • @Abend What did you end up going with? I am interested in this as well – nsquires Nov 12 '21 at 00:02
  • @Abend worked like a charm, thank you! – scubbo Nov 12 '21 at 07:22

1 Answer


You can extend Step and implement ICodePipelineActionFactory. It's an interface that receives a codepipeline.IStage and lets you add whatever actions you need to it.

Once you have the factory step, you pass it as either the pre or post option of the addStage() method.

Something close to the following should work:

class S3DeployStep extends Step implements ICodePipelineActionFactory {
  constructor(private readonly bucket: s3.IBucket, private readonly input: FileSet) {
    super('S3DeployStep');
  }

  public produceAction(stage: codepipeline.IStage, options: ProduceActionOptions): CodePipelineActionFactoryResult {

    stage.addAction(new codepipeline_actions.S3DeployAction({
        actionName: options.actionName,
        bucket: this.bucket,
        // Convert the FileSet into the Artifact that S3DeployAction expects.
        input: options.artifacts.toCodePipeline(this.input),
        // Take the run order from the options, or the pre/post placement is ignored.
        runOrder: options.runOrder,
    }));

    return { runOrdersConsumed: 1 };
  }
}

// ...

pipeline.addStage(stage, { post: [new S3DeployStep(targetBucket, landingPageStep.primaryOutput!)] });

But a way way way simpler method would be to use BucketDeployment to do it as part of the stack deployment. It creates a custom resource that copies data to a bucket from your assets or from another bucket. It won't get its own step in the pipeline and it will create a Lambda function under the hood, but it's simpler to use.
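A minimal sketch of the BucketDeployment approach, assuming CDK v1 module names and that the built site is available locally in `./out` at synth time (the bucket and construct ids are placeholders):

```typescript
import * as cdk from '@aws-cdk/core';
import * as s3 from '@aws-cdk/aws-s3';
import * as s3deploy from '@aws-cdk/aws-s3-deployment';

export class LandingPageStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // Placeholder bucket serving the static site.
    const siteBucket = new s3.Bucket(this, 'SiteBucket', {
      websiteIndexDocument: 'index.html',
    });

    // Custom resource that copies the asset into the bucket on deploy.
    new s3deploy.BucketDeployment(this, 'DeployLandingPage', {
      // Source.asset bundles a local directory at synth time; it cannot
      // consume a pipeline FileSet as input.
      sources: [s3deploy.Source.asset('./out')],
      destinationBucket: siteBucket,
    });
  }
}
```

Note this only works when the landing page is built alongside the CDK app (or already sits in another bucket), since BucketDeployment sources cannot be pipeline FileSets.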

kichik
  • `BucketDeployment` is nicer, I agree, since you can also invalidate the CloudFront cache, for example, but you cannot use artifacts/FileSets as input (or at least I could not). It expects the code in the same CDK package or in a bucket, which makes it not too useful here. The solution you provided worked like a charm. I suggest you edit the answer to include the `runOrder` parameter taken from `options.runOrder`; if you do not, the step ignores the `pre`/`post` placement. I struggled with this haha. Thank you for your answer! – Abend Oct 26 '21 at 02:35
  • May I ask where `sourceOutput` is coming from? With `Pipeline`, you can configure the output as an `Artifact`, but with `CodePipeline`, it is all implicit. – Yanfeng Liu Jan 11 '22 at 02:58