I have a Terraform script that, after `terraform apply`, successfully launches an AWS spot instance and then runs a bash script. After the script finishes running and the work is complete, I have been manually destroying the spot instance with `terraform destroy`. This is inconvenient, because I either have to watch my email for a CloudWatch alert or periodically check in on the progress of the script. Ideally, the AWS resources I created would be destroyed automatically. Does anyone know how I should go about doing this? Am I using the wrong AWS resources, i.e. should I be using ECS?
Viewed 4,077 times
3

wherestheforce
- I guess AWS ECS Run Task or AWS Batch seems suitable. – minamijoyo Apr 25 '17 at 15:35
- How about AWS Lambda, if you can convert the bash script to Python or another supported language? – BMW Apr 26 '17 at 06:26
- @minamijoyo I know AWS Batch is not yet implemented in Terraform (https://github.com/hashicorp/terraform/issues/12187), and I'm not sure about ECS Run Task. I would prefer to use Terraform for the infrastructure-as-code benefits. – wherestheforce Apr 26 '17 at 15:22
- @BMW I think I need bash. – wherestheforce Apr 26 '17 at 15:24
- The ECS Run Task API runs a one-shot task in a Docker container, but you still need to manage the ECS container instances. Terraform defines resources statically, so if you want to destroy resources, some job control is required anyway. – minamijoyo Apr 27 '17 at 10:45
2 Answers
3
The solution I found is to create a `null_resource` and include the following provisioner, which runs after my script finishes:
provisioner "remote-exec" {
  inline = [
    "sudo shutdown -h now",
  ]
}
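The shutdown trick relies on the behavior of one-time spot requests: when the instance powers itself off, AWS terminates it, so there is nothing left to pay for even though the Terraform state still records the resources. A minimal end-to-end sketch of how this could fit together — the AMI ID, key name, user, and script path are placeholders, not values from the original post:

```hcl
# Hypothetical sketch; ami, key_name, user, and script path are placeholders.
resource "aws_spot_instance_request" "worker" {
  ami                  = "ami-0abcdef1234567890"
  instance_type        = "c5.large"
  spot_type            = "one-time" # a one-time request terminates once the instance stops
  wait_for_fulfillment = true
  key_name             = "my-key"
}

resource "null_resource" "job" {
  provisioner "remote-exec" {
    connection {
      type        = "ssh"
      host        = aws_spot_instance_request.worker.public_ip
      user        = "ec2-user"
      private_key = file("~/.ssh/my-key.pem")
    }

    inline = [
      "bash /tmp/my_job.sh",  # the long-running script
      "sudo shutdown -h now", # power off; AWS then terminates the one-time spot instance
    ]
  }
}
```

Note that this only stops the billing; a later `terraform destroy` (or state cleanup) is still needed to reconcile Terraform's view of the world with what actually exists.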

wherestheforce
1
You can create a Lambda function and call your shell script from within it.
You can schedule it with the help of CloudWatch, have it terminate on completion, and apply monitoring to it.
How to : Can bash script be written inside a AWS Lambda function
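If you go the Lambda route, the scheduling piece can itself stay in Terraform, which preserves the infrastructure-as-code benefit the asker wants. A hypothetical sketch — the function name, schedule, packaging, and IAM role are all assumptions, and the zip would still need a handler that shells out to the bash script:

```hcl
# Hypothetical sketch; names, schedule, and IAM role are placeholders.
resource "aws_lambda_function" "job" {
  function_name = "run-my-job"
  filename      = "lambda.zip"   # contains a handler that invokes the bash script
  handler       = "handler.main"
  runtime       = "python3.9"
  role          = aws_iam_role.lambda_role.arn # IAM role defined elsewhere
  timeout       = 900            # Lambda's hard limit is 15 minutes
}

resource "aws_cloudwatch_event_rule" "schedule" {
  name                = "run-my-job-nightly"
  schedule_expression = "cron(0 2 * * ? *)" # 02:00 UTC daily
}

resource "aws_cloudwatch_event_target" "job" {
  rule = aws_cloudwatch_event_rule.schedule.name
  arn  = aws_lambda_function.job.arn
}

resource "aws_lambda_permission" "allow_events" {
  statement_id  = "AllowExecutionFromCloudWatch"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.job.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.schedule.arn
}
```

One caveat worth noting: Lambda's 15-minute timeout makes it a poor fit if the bash script is genuinely long-running, which is why the spot-instance self-shutdown approach above may suit this workload better.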

Shubham Bansal