26

I am working on a Bitbucket pipeline for pushing an image to Google Container Registry. I have created a service account with the Storage Admin role (bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com). In the pipeline I run:

# Authenticate gcloud as the service account
gcloud auth activate-service-account --key-file key.json
# Select the target project
gcloud config set project mgcp-xxxx
# Register gcloud as a Docker credential helper
gcloud auth configure-docker --quiet
# Push the image to the EU registry host
docker push eu.gcr.io/mgcp-xxxx/image-name

Although the login is successful, I get:

    Token exchange failed for project 'mgcp-xxxx'. Caller does not have permission 'storage.buckets.get'. To configure permissions, follow instructions at: https://cloud.google.com/container-registry/docs/access-control

Can anyone advise on what I am missing?

Thanks!

Tania Petsouka

20 Answers

29

For anyone reading all the way here: the other suggestions did not help me; what I found was that the Cloud Service Build Account role was also required. With that role added, the storage.buckets.get error disappears.

This is my minimal two-role setup to push Docker images: [screenshot of the two roles]

The Cloud Service Build Account role, however, adds many more permissions than simply storage.buckets.get. The exact permissions can be found here.
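
If you prefer the CLI, here is a rough sketch of granting that role; the role ID roles/cloudbuild.builds.builder comes from the comments below, and the project and service-account names are placeholders:

# Grant the Cloud Build Service Account role to the service account (placeholder names)
gcloud projects add-iam-policy-binding <project-id> \
    --member=serviceAccount:<sa-name>@<project-id>.iam.gserviceaccount.com \
    --role=roles/cloudbuild.builds.builder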

Note: I am well aware the Cloud Service Build Account role also adds the storage.objects.get permission. However, adding roles/storage.objectViewer did not resolve my problem, regardless of the fact that it had the storage.objects.get permission.

If the above does not work, you might have the wrong account active. This can be resolved with:

gcloud auth activate-service-account --key-file key.json

If that does not work, you might need to set the Docker credential helpers with:

gcloud auth configure-docker --project <project_name>

On one final note: there seemed to be some delay between setting a role and it taking effect via the gcloud tool. This was minimal though, think less than a minute.

Cheers

Shine
  • gcloud auth activate-service-account --key-file was the solution for me. Thanks – Jørgen Jul 08 '20 at 08:01
  • Thanks for this! Not sure if they renamed it, but it's now "Cloud Build Service Account" rather than "Cloud Service Build Account". If you need to set it from the command line or API, the name of the role is `roles/cloudbuild.builds.builder` – Matt Browne Apr 02 '21 at 17:20
  • Thanks man, the Cloud Service Build Account role is definitely required; the Google doc is missing that – Thomas Ducrot Nov 16 '21 at 20:04
20

You need to be logged into your account and have the project set to the one you'd like. There is a good chance you're just not logged in.

gcloud auth login

gcloud config set project <PROJECT_ID_HERE>
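
To double-check which account and project gcloud is actually using, these read-only commands are safe to run:

# Shows all credentialed accounts and marks the active one
gcloud auth list
# Shows the currently configured project
gcloud config get-value project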

Patrick Collins
  • This fixed my errors. Another project was configured as the default – uriz May 07 '23 at 17:21
  • This was my issue; I was at a juncture where I had trouble differentiating the account ID from the project ID on GCP. This did fix my issue. Basically, if you change from one project to another, the `gcloud config...` step is required. – Manabu Tokunaga Jun 24 '23 at 16:25
19

In the past I had another service account with the same name and different permissions. After discovering that service account names are cached, I created a new service account with a different name and it pushes properly.
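
A rough sketch of recreating the service account under a fresh name (all names here are placeholders):

# Create the replacement service account and a new JSON key
gcloud iam service-accounts create <new-sa-name> --display-name "<display name>"
gcloud iam service-accounts keys create key.json \
    --iam-account=<new-sa-name>@<project-id>.iam.gserviceaccount.com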

Tania Petsouka
6

These are the step-by-step commands which got me to push my first container to a private repo on GCR:

export PROJECT=pacific-shelter-218
export KEY_NAME=key-name1
export KEY_DISPLAY_NAME='My Key Name'

# Create the service account and confirm it exists
sudo gcloud iam service-accounts create ${KEY_NAME} --display-name ${KEY_DISPLAY_NAME}
sudo gcloud iam service-accounts list
# Generate a JSON key for the new service account
sudo gcloud iam service-accounts keys create --iam-account ${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com key.json
# Grant the Storage Admin role on the project
sudo gcloud projects add-iam-policy-binding ${PROJECT} --member serviceAccount:${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com --role roles/storage.admin
# Log Docker in with the JSON key, then push
sudo docker login -u _json_key -p "$(cat key.json)" https://gcr.io
sudo docker push gcr.io/pacific-shelter-218/mypcontainer:v2
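
If you want to confirm the role binding took effect before pushing, something like this should work with the same variables:

# List the roles bound to the new service account
sudo gcloud projects get-iam-policy ${PROJECT} \
    --flatten="bindings[].members" \
    --format='table(bindings.role)' \
    --filter="bindings.members:${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com"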
vitaly goji
  • Thank you! That finally worked for me. I tried creating the service account via the UI, and it would simply not work. I'll never know why. – jpbochi Dec 09 '19 at 11:46
  • I ran into similar problems with the UI. I managed to get it to work via the UI by adding the role: Editor ... but admit this is a rather recklessly insecure blunderbuss approach. – andrew pate Feb 10 '20 at 18:21
  • Using service accounts for users is a terrible idea. Users should use user accounts. – Francisco Delmar Kurpiel Apr 17 '20 at 10:18
5

It seems the documentation at https://cloud.google.com/container-registry/docs/access-control is outdated. It says:

Note: Pushing images requires object read and write permissions as well as the storage.buckets.get permission. The Storage Object Admin role does not include the storage.buckets.get permission, but the Storage Legacy Bucket Writer role does.

But the Storage Legacy Bucket Writer role is no longer available.

To fix the permission issue I added two roles to the service account (a CLI sketch follows the list):

  • Storage Admin
  • Storage Object Viewer (it has storage.buckets.get permission)
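
A hedged CLI equivalent of adding those two roles (project and service-account names are placeholders):

# Storage Admin
gcloud projects add-iam-policy-binding <project-id> \
    --member=serviceAccount:<sa-name>@<project-id>.iam.gserviceaccount.com \
    --role=roles/storage.admin
# Storage Object Viewer
gcloud projects add-iam-policy-binding <project-id> \
    --member=serviceAccount:<sa-name>@<project-id>.iam.gserviceaccount.com \
    --role=roles/storage.objectViewer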
Pavel Shastov
4

For anyone else coming across this: my issue was that I had not granted my service account the Storage Legacy Bucket Reader role; I'd only granted it Storage Object Viewer. Adding that legacy role fixed it.

It seems Docker is still using a legacy method to access GCR.
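
A sketch of granting that legacy role directly on the registry's backing bucket (placeholder names; the bucket may carry a region prefix such as eu.):

# Grant Storage Legacy Bucket Reader on the GCR storage bucket
gsutil iam ch \
    serviceAccount:<sa-name>@<project-id>.iam.gserviceaccount.com:roles/storage.legacyBucketReader \
    gs://artifacts.<project-id>.appspot.com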

Jonas D
3

Here in the future, I've discovered that I no longer have any legacy options. In this case I was forced to grant full Storage Admin. I'll open a ticket with Google about this; it's a bit extreme to need that just to push an image. This might help someone else from the future.

MXWest
2

GCR just uses GCS to store images. Check the permissions on the artifacts.<project>.appspot.com bucket in GCS within the same project.
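
A quick way to inspect those permissions from the CLI (the bucket name is a placeholder; prepend a region such as eu. if your registry uses one):

# Print the IAM policy of the registry's backing bucket
gsutil iam get gs://artifacts.<project-id>.appspot.com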

Dan
2

I tried several things, but it seems you have to run gcloud auth configure-docker.
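
After running it, ~/.docker/config.json should register gcloud as the credential helper for the GCR hosts, roughly like this:

{
  "credHelpers": {
    "gcr.io": "gcloud",
    "eu.gcr.io": "gcloud",
    "us.gcr.io": "gcloud",
    "asia.gcr.io": "gcloud"
  }
}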

Cloudkollektiv
1

Adding these roles to the service account in Google Cloud IAM fixed it for me:

  • Editor
  • Storage Object Admin
  • Storage Object Viewer

s4wet
1

I think the discrepancy is that https://cloud.google.com/container-registry/docs/access-control says, in the #permissions_and_roles section, that you need the Storage Admin role in order to push images. However, the next section, which explains how to configure access, says to add Storage Object Admin to enable push access for the account you're configuring. Switching to Storage Admin should fix the issue.

dcow
1

UI method:

  1. Add rights at https://console.cloud.google.com/iam-admin/iam
  2. The role is "Storage Admin" as explained in [1]
  3. Then refresh your token with:
    gcloud auth login
    gcloud config set project <PROJECT_ID_HERE>
  4. Push again

Links that can help: [1] https://cloud.google.com/storage/docs/access-control/iam-roles [2] https://cloud.google.com/container-registry/docs/access-control#grant

1

After being very annoyed by how gcloud's docs don't cover the standard use case of creating a service account that can push images to gcr.io, I decided to make this:

The following is meant to be a tutorial (a how-to guide optimized for learning) on using bash/zsh copy-pasteable automation to create a service account from scratch with image pull and image push rights to gcr.io.

I turned this into a git gist for easier sharing: https://gist.github.com/neoakris/bd53146a7a610253abdbc1234ffb357b

See here for speed run version of the same: https://gist.github.com/neoakris/f1c4b329901811360ce6269ca631c80b

# Set Input Vars
export SA_SHORT_NAME=gcr-sa
export PROJECT=mygcpproject
export SA_NAME=$SA_SHORT_NAME@$PROJECT.iam.gserviceaccount.com
export SA_KEY_FILE=$HOME/Downloads/$SA_SHORT_NAME.auth.json
export TEST_IMAGE=gcr.io/$PROJECT/test

################################################################
# Prep work that will help make the full process easier to understand

# Login to your account with rights
gcloud auth login

# delete docker config for testability
mv $HOME/.docker/config.json   $HOME/.docker/old-config.json

# Generate some test data
docker image pull busybox
docker image tag busybox "$TEST_IMAGE"1
docker image tag busybox "$TEST_IMAGE"2
docker image push "$TEST_IMAGE"1
# Unauthorized is expected, this is to improve testability

# Config docker as your admin user to upload test data
gcloud auth configure-docker

# Verify your human account can create an image
docker image push "$TEST_IMAGE"1

# delete docker config for testability
rm $HOME/.docker/config.json

################################################################
# Creating an SA with Image pull rights

# Create SA
gcloud iam service-accounts create $SA_SHORT_NAME --description="SA for GCR" --display-name="$SA_SHORT_NAME" --project=$PROJECT

# Create SA Auth Key in Downloads Folder
gcloud iam service-accounts keys create $SA_KEY_FILE --iam-account=$SA_NAME

# Add Image Pull rights
gcloud projects add-iam-policy-binding $PROJECT --member=serviceAccount:$SA_NAME --role=roles/containerregistry.ServiceAgent

# Verify Rights
gcloud projects get-iam-policy $PROJECT  \
--flatten="bindings[].members" \
--format='table(bindings.role)' \
--filter="bindings.members:$SA_NAME"

# Login as the SA
gcloud auth activate-service-account $SA_NAME --key-file=$SA_KEY_FILE

# Config docker as your SA user
gcloud auth configure-docker

# Verify that SA can pull from but not push to gcr.io
docker image pull "$TEST_IMAGE"1
# ^-- pull successful
docker image push "$TEST_IMAGE"2
# ^-- push complains about permissions

###################################################################
# Adding Image Push rights to SA

# Logout of all accounts (human and service accounts)
gcloud auth revoke --all
rm $HOME/.docker/config.json
docker image pull "$TEST_IMAGE"1
# ^-- fails, which verifies that this requires authentication

# Login as human to add image push rights to SA
gcloud auth login

# Alternative method of docker login as the SA:
# (It allows us to be logged in as SA from docker perspective, but logged 
# in as human from gcloud perspective, which makes testing easier.)
cat $SA_KEY_FILE | docker login -u _json_key --password-stdin https://gcr.io

# Confirm that we have the same access as SA from docker perspective
# but admin access from gcloud perspective
docker image pull "$TEST_IMAGE"1
# ^-- pull successful (expected for SA)
docker image push "$TEST_IMAGE"2
# ^-- push complains about permissions (expected for SA)

gcloud projects get-iam-policy $PROJECT  \
--flatten="bindings[].members" \
--format='table(bindings.role)' \
--filter="bindings.members:$SA_NAME"
# Shows the SA has roles/containerregistry.ServiceAgent
# ^-- this is an admin operation which shows gcloud is using human user rights

# Add Image Push rights
gcloud projects add-iam-policy-binding $PROJECT --member=serviceAccount:$SA_NAME --role=roles/storage.admin
gcloud projects add-iam-policy-binding $PROJECT --member=serviceAccount:$SA_NAME --role=roles/storage.objectViewer

# Test that SA has image push rights
docker image push "$TEST_IMAGE"2
# success
neoakris
0

I had a hard time figuring this out.

Although the error message was the same, my issue was that I was using the project name and not the project ID in the image URL.
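
A quick way to double-check the project ID (as opposed to the name) before building the image URL:

# Lists PROJECT_ID next to NAME so the two are easy to tell apart
gcloud projects list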

Natalia C
0

I created a separate service account to handle GCR I/O, added the Artifact Registry Administrator role (I need to push and pull images), and it started pushing images to GCR again.

DK_DEV
0

The docker push command will return this permission error if Docker is not authenticated with gcr.io.

Follow the steps below.

  1. Create a service account (or use an existing one) and grant it the following privileges:

    • Storage Admin
    • Storage Object Admin
  2. Generate a service account key (JSON) and download it

  3. Run docker-credential-gcr configure-docker

  4. Log in to Docker with the service account:

    docker login -u _json_key -p "$(cat [SERVICE_ACCOUNT_KEY.json])" https://gcr.io

  5. Try to push your Docker image to GCR:

    docker push gcr.io/<project_id>/<image>:<tag>

Thushan
0

Pushing images requires object read and write permissions as well as the storage.buckets.get permission. The Storage Object Admin role does not include the storage.buckets.get permission, but the Storage Legacy Bucket Writer role does. You can find this under a note at https://cloud.google.com/container-registry/docs/access-control

So adding the Storage Legacy Bucket Writer role fixed it for me, as the Storage Object Admin role doesn't have the required storage.buckets.get permission.
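
A hedged sketch of granting that role on the registry bucket (placeholder names; prepend a region such as eu. if applicable):

# Grant Storage Legacy Bucket Writer on the GCR storage bucket
gsutil iam ch \
    serviceAccount:<sa-name>@<project-id>.iam.gserviceaccount.com:roles/storage.legacyBucketWriter \
    gs://artifacts.<project-id>.appspot.com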

  • Please provide an explanation of your answer so that the next user knows why this solution worked for you. Also, sum up your answer in case the link stops working in the future. – Elydasian Jul 21 '21 at 08:38
0

What worked for me was going to the Google Cloud console -> IAM & Admin -> setting Storage Admin as one of the roles for the service account.

I.Tyger
0

I spent too long figuring this out, as I was pushing images from one project to another (intentionally) and didn't clock which bucket I needed to grant access to.

Details are as per https://cloud.google.com/container-registry/docs/access-control.

In summary: to push images to Container Registry via Cloud Build, you need to grant the Storage Legacy Bucket Writer role to the {projectB-id}@cloudbuild.gserviceaccount.com service account on the {storage-region.}artifacts.{projectA-name}.appspot.com bucket.
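
A hedged gsutil sketch using the same placeholders:

# Cross-project grant: project B's Cloud Build SA gets write access to project A's registry bucket
gsutil iam ch \
    serviceAccount:{projectB-id}@cloudbuild.gserviceaccount.com:roles/storage.legacyBucketWriter \
    gs://{storage-region.}artifacts.{projectA-name}.appspot.com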

barclakj
0

In my case, this error was caused by the Storage API (which Google Container Registry images are pushed to) having been put inside a VPC Service Controls perimeter.

This can be confirmed, and diagnosed further if required, by looking through the logs accessible via the VPC Service Controls troubleshooting page.

u-phoria