
I am wondering what the best approach is for baking an AMI. Baking offers a lot of consistency, but that consistency is hard to maintain when you have to re-bake the AMI for a small security update or a new package version: more than likely you will also end up updating other packages you didn't intend to touch, and that can break something.

So far I am baking in all my package installs, including Docker, and pre-pulling base images (Ubuntu, for example).

I know it is possible to specify the exact package version you need with apt-get install or its cfn-init equivalent, but what if that version is no longer available upstream? Should I put my packages in an S3 bucket? If so, what about all of their dependencies? Is there a simple way to do apt-get install from S3 instead of going out to the third-party repo?
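For context, what I have in mind for pinning and for installing from S3 is roughly the following; the package name, version string, and bucket name are placeholders, not my real setup:

```sh
# Pin an exact version at install time (package and version are placeholders).
apt-get install -y nginx=1.18.0-6ubuntu14
apt-mark hold nginx   # stop later apt-get upgrade runs from moving it

# One workaround I'm considering: keep a copy of the .deb in S3 and install
# it from there during the bake, letting apt resolve any missing dependencies.
apt-get download nginx
aws s3 cp nginx_1.18.0-6ubuntu14_amd64.deb s3://my-ami-packages/
aws s3 cp s3://my-ami-packages/nginx_1.18.0-6ubuntu14_amd64.deb .
dpkg -i nginx_1.18.0-6ubuntu14_amd64.deb || apt-get install -f -y
```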

alexfvolk

1 Answer


I just answered a similar question about baking resources into an AMI vs. using a configuration management tool like Chef, Puppet, etc.

The short answer is to try not to bake software into the AMI, but rather to build on top of base images with repeatable "recipes" (a Chef term).
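As a rough illustration of that idea without Chef (none of this is from your setup; package names and versions are placeholders), a small idempotent bootstrap script run on top of a stock Ubuntu base AMI, for example via user data, plays the role of the recipe:

```sh
#!/usr/bin/env bash
# Repeatable "recipe" applied on top of a stock Ubuntu base AMI at launch.
set -euo pipefail
export DEBIAN_FRONTEND=noninteractive

apt-get update

# Install only what this role needs, at pinned versions (placeholders).
apt-get install -y \
    docker.io=24.0.7-0ubuntu1 \
    awscli

systemctl enable --now docker

# Pre-pull the container base images this host runs.
docker pull ubuntu:22.04
```

Re-running the same script against a fresh base AMI gives you the repeatability you would otherwise get from re-baking.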

As for installing specific versions of packages, you certainly can pin software dependencies to specific versions. If you aren't doing anything special with them, I would strongly advise using the native package managers where you can. As for packages no longer being available, with Ubuntu LTS that hopefully shouldn't be much of an issue.
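For example, with apt you can pin a version through a preferences file so that upgrades during a re-bake do not move it; the package name and version below are placeholders:

```sh
# Pin docker-ce to one version via apt preferences (values are placeholders).
cat > /etc/apt/preferences.d/docker-ce <<'EOF'
Package: docker-ce
Pin: version 5:24.0.7-1~ubuntu.22.04~jammy
Pin-Priority: 1001
EOF

apt-get update
apt-get install -y docker-ce
```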

See the full answer here.

Mikelax