I would like to pass my Google Cloud Platform's service account JSON credentials file to a docker container so that the container can access a cloud storage bucket. So far I tried to pass the file as an environment parameter on the run command like this:

  • Using the --env flag: docker run -p 8501:8501 --env GOOGLE_APPLICATION_CREDENTIALS="/Users/gcp_credentials.json" -t -i image_name
  • Using the -e flag and even exporting the same env variable in the command line: docker run -p 8501:8501 -e GOOGLE_APPLICATION_CREDENTIALS="/Users/gcp_credentials.json" -t -i image_name

But nothing worked, and I always get the following error when running the docker container:

W external/org_tensorflow/tensorflow/core/platform/cloud/google_auth_provider.cc:184] All attempts to get a Google authentication bearer token failed, returning an empty token. Retrieving token from files failed with "Not found: Could not locate the credentials file.".

How can I pass the Google credentials file to a container running locally on my personal laptop?

  • If you are running on Compute Engine, use a volume mount. Then you can specify GOOGLE_APPLICATION_CREDENTIALS=/volume/mount/path as a normal environment variable inside your container. – John Hanley May 23 '22 at 00:28
  • Does this answer your question? [Add a file in a docker image](https://stackoverflow.com/questions/56670437/add-a-file-in-a-docker-image) – Martin Zeitler May 23 '22 at 04:09
  • @JohnHanley It rather seems to be the situation: connecting from a local container to GCS in order to run TF2, even though the question doesn't literally state that. – Martin Zeitler May 23 '22 at 04:27
  • @MartinZeitler - Hi Martin, I am not sure what you mean. The only Google service that supports running Docker is Compute Engine. That is why I said, "If you are running on Compute Engine". – John Hanley May 23 '22 at 06:11
  • @JohnHanley The question does not tell where the container runs, while the `docker` command seemingly had been issued in a local shell ...that's why I'd assume this scenario. It probably doesn't even matter where it runs, while the task is to add a config file into it. – Martin Zeitler May 23 '22 at 06:28
  • Does my article help? https://medium.com/google-cloud/use-google-cloud-user-credentials-when-testing-containers-locally-acb57cd4e4da – guillaume blaquiere May 23 '22 at 07:20
  • Hi @JohnHanley, like Martin said, I'm running the container on my personal laptop. Sorry for forgetting to mention it in my post, which I have just edited accordingly. Thank you – pedro malheiro May 23 '22 at 08:58
  • Docker volume mounts work on your laptop as well. – John Hanley May 23 '22 at 18:04

3 Answers


You cannot "pass" an external path via an environment variable alone; the JSON file itself has to be made available inside the container, either copied into the image or mounted as a volume.
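A minimal sketch of both options, assuming the credentials file lives at ~/gcp_credentials.json on the host and the image is called image_name (both placeholders taken from the question):

```shell
# Option 1: bind-mount the file at run time (preferred for local dev;
# the key never gets baked into an image layer)
docker run -p 8501:8501 \
  -v ~/gcp_credentials.json:/tmp/keys/gcp_credentials.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/keys/gcp_credentials.json \
  -t -i image_name

# Option 2: copy the file into the image (avoid for real keys --
# anyone who can pull the image can extract the file)
#   Dockerfile:
#     COPY gcp_credentials.json /app/gcp_credentials.json
#     ENV GOOGLE_APPLICATION_CREDENTIALS=/app/gcp_credentials.json
```

Note that GOOGLE_APPLICATION_CREDENTIALS must be a path inside the container's filesystem; pointing it at a host path like /Users/gcp_credentials.json is why the original error says the credentials file could not be located.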

Martin Zeitler

I log into gcloud in my local environment and then share that JSON file as a volume, at the same location, in the container.

Here is a great post on how to do it, with the relevant extract below: Use Google Cloud user credentials when testing containers locally

Login locally

To get your default user credentials in your local environment, you have to use the gcloud SDK. There are two commands for authentication:

  • gcloud auth login to get authenticated for all subsequent gcloud commands
  • gcloud auth application-default login to create your ADC (Application Default Credentials) locally, in a "well-known" location

Note location of credentials

The Google auth library tries to find valid credentials by performing checks in this order:

  • Look at the environment variable GOOGLE_APPLICATION_CREDENTIALS. If it exists, use it, else…
  • Look at the metadata server (only on Google Cloud Platform). If it returns correct HTTP codes, use it, else…
  • Look at the "well-known" location for a user credentials JSON file

The "well-known" locations are:

  • On Linux and macOS: ~/.config/gcloud/application_default_credentials.json
  • On Windows: %appdata%/gcloud/application_default_credentials.json
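The "well-known location" rule can be reproduced in a few lines. This is a sketch of the path computation only, not the auth library's actual implementation:

```python
import os
import sys


def well_known_adc_path() -> str:
    """Return the platform's "well-known" ADC file location."""
    if sys.platform == "win32":
        base = os.environ["APPDATA"]  # %appdata% on Windows
    else:
        base = os.path.join(os.path.expanduser("~"), ".config")
    return os.path.join(base, "gcloud", "application_default_credentials.json")


print(well_known_adc_path())
# os.path.exists(well_known_adc_path()) will typically be False inside a
# fresh container -- which is exactly the problem the mount below solves.
```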

Share volume with container

Therefore, you have to run your local docker run command like this:

  ADC=~/.config/gcloud/application_default_credentials.json
  docker run \
    -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/keys/FILE_NAME.json \
    -v ${ADC}:/tmp/keys/FILE_NAME.json:ro \
    <IMAGE_URL>

NB: this is only for local development; on Google Cloud Platform the credentials for the service are automatically provided for you.

Zaffer

Two ways to do it:

Docker secrets: works with Docker Swarm mode.

  • create a Docker secret
  • attach the secret to a container using --secret

The advantage is that secrets are encrypted at rest, and are only decrypted when mounted into containers.
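The two steps above might look like this (a sketch; the service name myapp and the secret name gcp_key are assumptions, --secret requires Swarm mode via docker service create, and Swarm mounts each secret at /run/secrets/<name> by default):

```shell
# 1. Create the secret from the key file (requires: docker swarm init)
docker secret create gcp_key ./gcp_credentials.json

# 2. Attach it to a Swarm service; it appears at /run/secrets/gcp_key
docker service create \
  --name myapp \
  --secret gcp_key \
  -e GOOGLE_APPLICATION_CREDENTIALS=/run/secrets/gcp_key \
  image_name
```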

gagan