
I'm looking to give Cloud Build access to a PostgreSQL database during the build steps, because it's part of integration testing for the Python application I'm running. Any suggestions on how to handle this authorization without exposing the database to the world?


aikaSan21

2 Answers


You can do this using a Private Pool where you define the network CIDR to be used at build time; see https://cloud.google.com/build/docs/private-pools/private-pools-overview to learn more.
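A minimal sketch of what this looks like in a build config, assuming a private pool named `my-pool` already exists in `us-central1` (the pool name, project ID, region, and test command are all placeholders to adapt to your setup):

```yaml
# cloudbuild.yaml -- run the build on a private worker pool so the
# build steps get network access from the pool's peered VPC range,
# letting them reach a database that isn't exposed publicly.
steps:
- name: 'python'
  entrypoint: 'python'
  args: ['-m', 'pytest', 'tests/integration']
options:
  pool:
    name: 'projects/PROJECT_ID/locations/us-central1/workerPools/my-pool'
```

You would then allow the pool's CIDR range in the database's firewall rules instead of any public address range.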


(Previous answer follows, which I've left in place for transparency around history.)

At this time, you would need to whitelist all of the GCE public IP address ranges -- which effectively exposes your database to the world. (So don't do that!)

However, at Google Next we announced and demoed a coming Alpha release that will enable you to run GCB workloads in a hybrid VPC world with access to protected (on-prem) resources. As part of that Alpha, you could whitelist internal-only addresses to achieve your goal securely.

You can watch for a public announcement in our release notes.

David Bendory
  • do you think it is possible to use Terraform to discover the Cloud Build IP range, as for other services? https://github.com/terraform-providers/terraform-provider-google/blob/master/google/data_source_google_netblock_ip_ranges_test.go – c4f4t0r Nov 03 '19 at 21:19

Now you can use the IAP (Identity-Aware Proxy) TCP forwarding feature.

I don't know if this is still helpful or not, but I ran into a similar situation a while ago and was able to fix it like this.

steps:
# Start an IAP tunnel to the sql-vm instance in the background, give it
# a few seconds to come up, then run the test client against the
# tunnel's local endpoint (localhost:5555).
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
  entrypoint: /bin/sh
  args:
  - '-c'
  - |
    gcloud compute start-iap-tunnel sql-vm 5555 \
      --local-host-port=localhost:5555 \
      --zone=us-west1-a & sleep 5 && python echo_client.py
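The `echo_client.py` script isn't shown in the answer; a minimal sketch of what it might look like, assuming it just round-trips one message through the tunnel's local endpoint (host, port, and message are placeholders matching `--local-host-port` above):

```python
# Hypothetical sketch of echo_client.py: connects to the IAP tunnel's
# local endpoint and round-trips one message over TCP.
import socket


def send_message(host: str, port: int, message: bytes) -> bytes:
    """Send `message` to (host, port) and return the server's reply."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(message)
        return sock.recv(1024)


if __name__ == "__main__":
    # 5555 matches --local-host-port=localhost:5555 in the build step.
    reply = send_message("localhost", 5555, b"ping from Cloud Build\n")
    print("server replied:", reply.decode().strip())
```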

I also wrote a blog post about this; check it out at hodo.dev.