
Despite using the depends_on directive, it looks like the zip is not created before Terraform tries to put it in the bucket. Judging by the pipeline output, the archiving step is somehow skipped before the upload to the bucket fires. Both source files (index.js and package.json) exist.

resource "google_storage_bucket" "cloud-functions" {
  project       = var.project-1-id
  name          = "${var.project-1-id}-cloud-functions"
  location      = var.project-1-region
}

resource "google_storage_bucket_object" "start_instance" {
  name       = "start_instance.zip"
  bucket     = google_storage_bucket.cloud-functions.name
  source     = "${path.module}/start_instance.zip"
  depends_on = [
    data.archive_file.start_instance,
  ]
}

data "archive_file" "start_instance" {
  type        = "zip"
  output_path = "${path.module}/start_instance.zip"

  source {
    content  = file("${path.module}/scripts/start_instance/index.js")
    filename = "index.js"
  }

  source {
    content  = file("${path.module}/scripts/start_instance/package.json")
    filename = "package.json"
  }
}
Terraform has been successfully initialized!
 $ terraform apply -input=false "planfile"
 google_storage_bucket_object.stop_instance: Creating...
 google_storage_bucket_object.start_instance: Creating...
 Error: open ./start_instance.zip: no such file or directory
   on cloud_functions.tf line 41, in resource "google_storage_bucket_object" "start_instance":
   41: resource "google_storage_bucket_object" "start_instance" {

LOGS:

 2020-11-18T13:02:56.796Z [DEBUG] plugin.terraform-provider-google_v3.40.0_x5: 2020/11/18 13:02:56 [WARN] Failed to read source file "./start_instance.zip". Cannot compute md5 hash for it.
 2020/11/18 13:02:56 [WARN] Provider "registry.terraform.io/hashicorp/google" produced an invalid plan for google_storage_bucket_object.stop_instance, but we are tolerating it because it is using the legacy plugin SDK.
     The following problems may be the cause of any confusing errors from downstream operations:
       - .detect_md5hash: planned value cty.StringVal("different hash") does not match config value cty.NullVal(cty.String)
 2020/11/18 13:02:56 [WARN] Provider "registry.terraform.io/hashicorp/google" produced an invalid plan for google_storage_bucket_object.start_instance, but we are tolerating it because it is using the legacy plugin SDK.
     The following problems may be the cause of any confusing errors from downstream operations:
       - .detect_md5hash: planned value cty.StringVal("different hash") does not match config value cty.NullVal(cty.String)
  • Please set the variable TF_LOG to DEBUG and capture the output on TF_LOG_PATH (see https://www.terraform.io/docs/internals/debugging.html) – Iñigo González Nov 18 '20 at 12:21
  • @Iñigo There I get `[WARN] ReferenceTransformer: reference not found: "data.archive_file.start_instance#destroy"` – wujt Nov 18 '20 at 13:08
  • I have added some logs output above. – wujt Nov 18 '20 at 13:12
  • What is your terraform version? There were issues with Version 0.11.11 – Fedor Petrov Nov 18 '20 at 14:23
  • @FedorPetrov 0.13.0 – wujt Nov 18 '20 at 14:51
  • Please have a look into the [Terraform Official Documentation](https://registry.terraform.io/providers/hashicorp/archive/latest/docs/data-sources/archive_file#output_path). Also please see this [GitHub issue](https://github.com/hashicorp/terraform-provider-google/issues/1938#issuecomment-650304757) and [Stackoverflow post](https://stackoverflow.com/questions/56916719/is-there-a-way-to-define-multiple-source-file-for-terraform-archive-provider), which may help you. – Nibrass H Nov 25 '20 at 15:46

1 Answer


I had exactly the same issue with a GitLab CI/CD pipeline. After some digging, and based on a discussion I found, it turned out that with this setup the plan and apply stages run in separate containers, and the archiving step is executed in the plan stage, so the resulting zip never exists in the container that runs apply.
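For illustration, here is a minimal sketch of the kind of .gitlab-ci.yml that produces this behaviour; the job names, image, and artifact paths are my assumptions, not taken from the question. The plan job evaluates the archive_file data source and writes the zip inside its own container, but only the plan file is handed over to the apply job:

stages:
  - plan
  - apply

plan:
  stage: plan
  image:
    name: hashicorp/terraform:0.13.0
    entrypoint: [""]  # the image's default entrypoint is the terraform binary itself
  script:
    - terraform init -input=false
    # data.archive_file is read during plan, so start_instance.zip
    # is written inside this container only
    - terraform plan -input=false -out=planfile
  artifacts:
    paths:
      - planfile  # the generated zip is not passed along

apply:
  stage: apply
  image:
    name: hashicorp/terraform:0.13.0
    entrypoint: [""]
  script:
    - terraform init -input=false
    # this fresh container has planfile but no start_instance.zip,
    # hence "open ./start_instance.zip: no such file or directory"
    - terraform apply -input=false "planfile"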

A workaround is to create a dummy trigger with a null_resource and force the archive_file to depend on it, so that the archive is (re)built during the apply stage.

resource "null_resource" "dummy_trigger" {
  # timestamp() changes on every run, so this resource is replaced on
  # each apply, which forces the dependent archive_file to be evaluated
  # at apply time rather than only at plan time.
  triggers = {
    timestamp = timestamp()
  }

resource "google_storage_bucket" "cloud-functions" {
  project       = var.project-1-id
  name          = "${var.project-1-id}-cloud-functions"
  location      = var.project-1-region
}

resource "google_storage_bucket_object" "start_instance" {
  name       = "start_instance.zip"
  bucket     = google_storage_bucket.cloud-functions.name
  source     = "${path.module}/start_instance.zip"
  depends_on = [
    data.archive_file.start_instance,
  ]
}

data "archive_file" "start_instance" {
  type        = "zip"
  output_path = "${path.module}/start_instance.zip"

  source {
    content  = file("${path.module}/scripts/start_instance/index.js")
    filename = "index.js"
  }

  source {
    content  = file("${path.module}/scripts/start_instance/package.json")
    filename = "package.json"
  }
  
  depends_on = [
    null_resource.dummy_trigger,
  ]
}
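
An alternative on the pipeline side, instead of (or in addition to) the dummy trigger, is to ship the zip built at plan time to the apply job as an artifact, so the file already exists in the apply container. A sketch under the same assumed job layout as above:

plan:
  stage: plan
  image:
    name: hashicorp/terraform:0.13.0
    entrypoint: [""]
  script:
    - terraform init -input=false
    - terraform plan -input=false -out=planfile
  artifacts:
    paths:
      - planfile
      - start_instance.zip  # hand the archive over to the apply container

Independently of that, referencing data.archive_file.start_instance.output_path as the object's source, instead of repeating the literal path, gives Terraform an implicit dependency and makes the explicit depends_on unnecessary, although it does not by itself solve the split-container problem.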