
I am not able to copy files from a VM to a Cloud Storage bucket on GCP.

Here is what I tried.

  • Created a VM instance and allowed full access to all Cloud APIs; when that did not work, I granted full access to each API individually, which still did not work.
  • Added a file in it.
  • Created a bucket
  • Tried copying file from VM to bucket

Here is the output from my terminal:

learn_gcp@earthquakevm:~$ ls
test.text  training-data-analyst
learn_gcp@earthquakevm:~$ gsutil cp test.text gs://kukroid-gcp
Copying file://test.text [Content-Type=text/plain]...
AccessDeniedException: 403 Provided scope(s) are not authorized                 
learn_gcp@earthquakevm:~$ 


My VM details: [screenshot]. My bucket details: [screenshot]. Can anyone suggest what I am missing and how to fix this?

– kukroid
  • Does this answer your question? [gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE](https://stackoverflow.com/questions/27275063/gsutil-copy-returning-accessdeniedexception-403-insufficient-permission-from) ...while your "VM details" barely matter, as they only show the enabled APIs, but not the scopes the service account has access to. – Martin Zeitler Sep 10 '21 at 05:05

2 Answers


Maybe your VM is still using cached credentials whose access scopes have not changed.

Try deleting the ~/.gsutil directory and running gsutil again.
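For example (a minimal sketch; the file and bucket names are taken from the question, and note that rm -rf deletes the cached state permanently):

# Remove gsutil's cached credentials and state so the current scopes are picked up
rm -rf ~/.gsutil
# Retry the copy
gsutil cp test.text gs://kukroid-gcp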

– SeungwooLee

The error 403 Provided scope(s) are not authorized shows that the service account you're using for the copy doesn't have permission to write an object to the kukroid-gcp bucket.

And based on your screenshot, you are using the Compute Engine default service account, which by default does not have a scope granting write access to the bucket. To check which scopes your service account actually has, you can use curl to query the GCE metadata server:

curl -H 'Metadata-Flavor: Google' "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"

You can add the result of that command to your post as additional information. If none of the listed scopes grants access to Cloud Storage, you will need to add the necessary scopes; the Compute Engine documentation on access scopes explains how they work and lists all of the available ones.
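For reference, a VM created with the default scopes typically reports something like the following (illustrative; the exact list depends on how the instance was created). Note that devstorage.read_only allows reading from buckets but not writing; writing requires devstorage.read_write, devstorage.full_control, or the broad cloud-platform scope.

https://www.googleapis.com/auth/devstorage.read_only
https://www.googleapis.com/auth/logging.write
https://www.googleapis.com/auth/monitoring.write
https://www.googleapis.com/auth/servicecontrol
https://www.googleapis.com/auth/service.management.readonly
https://www.googleapis.com/auth/trace.append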

Another workaround is to create a new service account instead of using the default one. Here is the step-by-step process, without deleting the existing VM (a gcloud equivalent is sketched after the list):

  1. Stop the VM instance
  2. Create a new service account: IAM & Admin > Service Accounts > Add service account
  3. Give the new service account the Cloud Storage Admin role
  4. Create a private key for this service account
  5. After creating the new service account, go to the VM, click on its name, then click Edit
  6. In editing mode, scroll down to the Service account section and select the new service account
  7. Start your instance, then try to copy the file again
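If you prefer the command line, the same change can be done with gcloud (a minimal sketch; the instance name comes from the question, while the zone and service-account email are placeholders to fill in):

# The instance must be stopped before its service account or scopes can be changed
gcloud compute instances stop earthquakevm --zone=ZONE
# Attach the new service account with a scope that allows writing to Cloud Storage
gcloud compute instances set-service-account earthquakevm --zone=ZONE \
    --service-account=NEW_SA@PROJECT_ID.iam.gserviceaccount.com \
    --scopes=storage-rw
gcloud compute instances start earthquakevm --zone=ZONE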
– Alex G