I am currently struggling to get the user_data script to run when starting an EC2 instance with Terraform. I pre-configured my AMI using Packer and referenced the custom AMI in my Terraform file. Since I need to know the RDS instance URL when starting the EC2 instance, I read it from the Terraform remote state and pass it to the user_data script, which exports the values as environment variables. My app reads these environment variables to connect to the database. Everything works as expected locally and on CI when running tests, and manually setting the variables and then starting the app on the instance also works. The only problem is that the user_data script never executes, apparently because cloud-init already ran when the AMI was created with Packer.
Notice that I read the current DB state inside Terraform, so the values are only known at apply time; that rules out the traditional approaches for getting the script to execute again, such as baking it into the AMI at build time. I also tried deleting the cloud-init data as described in this question and this one, without success.
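For reference, this is how the cloud-init side can be inspected on a running instance to see whether the user data was picked up at all (standard Ubuntu cloud-init paths):

# Overall cloud-init result for this boot
cloud-init status --long

# The user data exactly as cloud-init received it
sudo cat /var/lib/cloud/instance/user-data.txt

# stdout/stderr of the user_data script, if it ran
sudo cat /var/log/cloud-init-output.log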
This is my current Packer configuration:
build {
  name = "spring-ubuntu"
  sources = [
    "source.amazon-ebs.ubuntu"
  ]

  # Upload the pre-built jar; the destination is relative to the SSH user's home
  provisioner "file" {
    source      = "build/libs/App.jar"
    destination = "~/App.jar"
  }

  provisioner "shell" {
    inline = [
      "sleep 30",
      "sudo apt update",
      "sudo apt -y install openjdk-17-jdk",
      # Remove cloud-init state so the next boot is treated as a first boot
      "sudo rm -Rf /var/lib/cloud/data/scripts",
      "sudo rm -Rf /var/lib/cloud/scripts/per-instance",
      "sudo rm -Rf /var/lib/cloud/data/user-data*",
      "sudo rm -Rf /var/lib/cloud/instances/*",
      "sudo rm -Rf /var/lib/cloud/instance",
    ]
  }
}
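As I understand it, the rm -Rf lines above try to reset cloud-init so that the next boot counts as a first boot. A sketch of the same cleanup using cloud-init's own clean subcommand instead of deleting directories by hand (I have not confirmed this resolves the issue):

provisioner "shell" {
  inline = [
    # Reset cloud-init state and logs so the next boot is treated as a first boot
    "sudo cloud-init clean --logs",
  ]
}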
This is my current Terraform configuration:
resource "aws_instance" "instance" {
ami = "ami-123abc"
instance_type = "t2.micro"
subnet_id = tolist(data.aws_subnet_ids.all.ids)[0]
vpc_security_group_ids = [aws_security_group.ec2.id]
user_data = <<EOF
#!/bin/bash
export DB_HOST=${data.terraform_remote_state.state.outputs.db_address}
export DB_PORT=${data.terraform_remote_state.state.outputs.db_port}
java -jar ~/App.jar
EOF
lifecycle {
create_before_destroy = true
}
}
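For completeness, the same wiring can be expressed with templatefile instead of an inline heredoc; the rendered script is identical, so this should not change whether cloud-init executes it. The templates/user_data.sh.tpl path is hypothetical:

# main.tf (only the user_data argument changes)
user_data = templatefile("${path.module}/templates/user_data.sh.tpl", {
  db_host = data.terraform_remote_state.state.outputs.db_address
  db_port = data.terraform_remote_state.state.outputs.db_port
})

# templates/user_data.sh.tpl (hypothetical path)
#!/bin/bash
export DB_HOST=${db_host}
export DB_PORT=${db_port}
java -jar /home/ubuntu/App.jar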