21

Under Google Cloud Run, you can select which service account your container runs as. When using the default compute service account, generating a signed URL fails.

The workaround listed here works on Google Compute Engine -- if you allow all the scopes for the service account. There does not seem to be a way to do that in Cloud Run (at least not that I can find).

https://github.com/googleapis/google-auth-library-python/issues/50

Things I have tried:

  1. Assigned the service account the role: roles/iam.serviceAccountTokenCreator
  2. Verified the workaround in a Virtual Machine (vs. Cloud Run) in the same GCP project
  3. Verified the code works locally in the container with the service account loaded from a private key (via a JSON file)

The code in question:
from datetime import datetime, timedelta

from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket('EXAMPLE_BUCKET')
blob = bucket.get_blob('libraries/image_1.png')
expires = datetime.now() + timedelta(seconds=86400)
blob.generate_signed_url(expiration=expires)

Fails with:

you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py, line 51, in ensure_signed_credentials

When I try to add the workaround, it fails with:

Error calling the IAM signBytes API:
{
  "error": {
    "code": 400,
    "message": "Request contains an invalid argument.",
    "status": "INVALID_ARGUMENT"
  }
}
Exception Location: /usr/local/lib/python3.8/site-packages/google/auth/iam.py, line 81, in _make_signing_request

Workaround code as mentioned in the GitHub issue:

from google.cloud import storage
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta

def get_signing_creds(credentials):
    auth_request = requests.Request()
    print(credentials.service_account_email)
    signing_credentials = compute_engine.IDTokenCredentials(auth_request, "", service_account_email=credentials.service_account_email)
    return signing_credentials


client = storage.Client()
bucket = client.get_bucket('EXAMPLE_BUCKET')
blob = bucket.get_blob('libraries/image_1.png')
expires = datetime.now() + timedelta(seconds=86400)
signing_creds = get_signing_creds(client._credentials)
url = blob.generate_signed_url(expiration=expires, credentials=signing_creds)
print(url)

How do I generate a signed URL under Google Cloud Run? At this point, it seems like I may have to mount the service account key, which I wanted to avoid.

EDIT: To try and clarify, the service account has the correct permissions - it works in GCE and locally with the JSON private key.

sww314
  • 1,252
  • 13
  • 17
  • Same question as https://stackoverflow.com/questions/64205103/creating-v4-signed-urls-in-cloudrun ? – ahmet alp balkan Oct 06 '20 at 22:31
  • 1
    Same problem; that question is for Go vs Python. I have added the workaround to try to change the access token into a JWT, but it will not work in Cloud Run. It does work in a GCE Virtual Machine. – sww314 Oct 07 '20 at 01:38

6 Answers

21

Yes you can, but I had to dig deep to find out how (jump to the end if you don't care about the details).

If you look at the _signing.py file, line 623, you can see this:

if access_token and service_account_email:
   signature = _sign_message(string_to_sign, access_token, service_account_email)
...

If you provide the access_token and the service_account_email, you can use the _sign_message method. This method uses the IAM service SignBlob API at this line.

This is important because you can now sign blobs without having the private key locally! So that solves the problem, and the following code works on Cloud Run (and I'm sure on Cloud Functions):

def sign_url():
    from google.cloud import storage
    from datetime import datetime, timedelta

    import google.auth
    credentials, project_id = google.auth.default()

    # Perform a refresh request to get the access token of the current credentials (Else, it's None)
    from google.auth.transport import requests
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket('EXAMPLE_BUCKET')
    blob = bucket.get_blob('libraries/image_1.png')
    expires = datetime.now() + timedelta(seconds=86400)

    # In case of user credential use, define manually the service account to use (for development purpose only)
    service_account_email = "YOUR DEV SERVICE ACCOUNT"
    # If you use a service account credential, you can use the embedded email
    if hasattr(credentials, "service_account_email"):
        service_account_email = credentials.service_account_email

    url = blob.generate_signed_url(expiration=expires, service_account_email=service_account_email, access_token=credentials.token)
    return url, 200
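
For context, here is roughly how you might wire this into a Cloud Run service -- a hypothetical Flask handler, assuming the sign_url() function above (the route name is made up):

from flask import Flask

app = Flask(__name__)

@app.route("/signed-url")
def signed_url_endpoint():
    # sign_url() already returns a (body, status) tuple, which Flask accepts
    return sign_url()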

Let me know if it's not clear.

guillaume blaquiere
  • 66,369
  • 2
  • 47
  • 76
  • 1
    Unfortunately, this doesn't seem to work using the default app engine service account inside of a cloud function. It raises an exception: `AttributeError: you need a private key to sign credentials.the credentials you are currently using just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.` – istrupin Jan 04 '21 at 18:24
  • App Engine has a slightly different, legacy service account management system and yes, it's not compliant. In fact, you don't get the service account email in the default credential and so it doesn't work (or it works with a service account key file) – guillaume blaquiere Jan 04 '21 at 19:07
  • Sorry -- I may have been unclear -- in Cloud Functions the above doesn't work when using the Default App Engine service account. `credentials.service_account_email` gets the right email address, but unfortunately the signing does not work. – istrupin Jan 04 '21 at 19:15
  • My bad, you were clear, I read too quickly! Your case is strange! Cloud Run and Cloud Functions share a lot of common components, and the authentication mechanism is the same. It should work exactly in the same way. Can you share the version of your dependencies? – guillaume blaquiere Jan 04 '21 at 19:19
  • Certainly! In my case I'm using google-api-core==1.24.1; google-auth==1.24; google-cloud-core==1.5.0; google-cloud-storage==1.35.0 – istrupin Jan 04 '21 at 19:23
  • The lib code hasn't changed. can you open a new question with a minimal piece of your code that I can reproduce and understand on my side? – guillaume blaquiere Jan 04 '21 at 19:34
  • Certainly -- see https://stackoverflow.com/questions/65568916/generating-cloud-storage-signed-url-from-google-cloud-function-without-using-exp – istrupin Jan 04 '21 at 19:49
  • this is not working for me, it gives me an error. google.auth.exceptions.TransportError: Error calling the IAM signBytes API: b'{\n "error": {\n "code": 403,\n "message": "The caller does not have permission",\n "status": "PERMISSION_DENIED"\n }\n}\n' – I-PING Ou Jan 28 '21 at 14:59
  • Worked for me in CloudRun - Thanks. – worldofchris Feb 01 '21 at 07:08
  • In order for this to work, you have to add the role `Service Account Token Creator` to the AppEngine/ComputeEngine _default service account_. That role will allow the _default service account_ to actually sign blobs and create signed URLs. You do still need to refresh the credentials so as to get the `access_token` and use both `credential.service_account_email` and `credential.token` when calling the `generate_signed_url` method for it to work, as per this answer. – Guilherme Coppini Dec 15 '21 at 19:29
  • Unfortunately this isn't working for me on cloud run. `credentials.refresh(r)` raises `TypeError: 'Request' object is not callable`. (In case it's useful, credential.service_account_email was 'default'.) So instead I'm writing a service account credentials json from secret manager and creating client like so: `gcs.Client.from_service_account_json(str(fpath_json.absolute()))`. – Nathan Lloyd Nov 16 '22 at 05:03
2

The answer @guillaume-blaquiere posted here does work, but it requires an additional step not mentioned, which is to add the Service Account Token Creator role in IAM to your default service account, which will allow said default service account to "Impersonate service accounts (create OAuth2 access tokens, sign blobs or JWTs, etc)."

This allows the default service account to sign blobs, as per the signBlob documentation.

I tried it on AppEngine and it worked perfectly once that permission was given.

import datetime as dt

from google import auth
import google.auth.transport.requests  # makes auth.transport.requests available below
from google.cloud import storage

# SCOPES = [
#     "https://www.googleapis.com/auth/devstorage.read_only",
#     "https://www.googleapis.com/auth/iam"
# ]

credentials, project = auth.default(
#     scopes=SCOPES
)
credentials.refresh(auth.transport.requests.Request())

expiration_timedelta = dt.timedelta(days=1)

storage_client = storage.Client(credentials=credentials)
bucket = storage_client.get_bucket("bucket_name")
blob = bucket.get_blob("blob_name")

signed_url = blob.generate_signed_url(
    expiration=expiration_timedelta,
    service_account_email=credentials.service_account_email,
    access_token=credentials.token,
)

I downloaded a key for the AppEngine default service account to test locally, and in order to make it work properly outside of the AppEngine environment, I had to add the proper scopes to the credentials, as per the commented lines setting the SCOPES. You can ignore them if running only in AppEngine itself.

  • You do need to provide both `service_account_email` and `access_token` as parameters to the `generate_signed_url` function for it to work on the AppEngine environment, and do need to `refresh` the credentials in order to get said `access_token` (before the refresh, `credentials.token = None`). – Guilherme Coppini Dec 15 '21 at 19:31
  • Thanks, this worked for CloudRun for me after activating the IAM API and giving the CloudRun Service Account the Service Account Token Creator role. This answer should be voted higher up imo. – Tim M. Schendzielorz Jul 13 '23 at 11:54
1

An updated approach has been added to GCP's documentation for serverless instances such as Cloud Run and App Engine.

The following snippet shows how to create a signed URL from the storage library.

import datetime

from google.cloud import storage


def generate_upload_signed_url_v4(bucket_name, blob_name):
    """Generates a v4 signed URL for uploading a blob using HTTP PUT.

    Note that this method requires a service account key file. You can not use
    this if you are using Application Default Credentials from Google Compute
    Engine or from the Google Cloud SDK.
    """
    # bucket_name = 'your-bucket-name'
    # blob_name = 'your-object-name'

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )

    return url

Once your backend returns the signed URL, you can execute a curl PUT request from your frontend as follows:

curl -X PUT -H 'Content-Type: application/octet-stream' --upload-file my-file 'my-signed-url' 
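
If your client is Python rather than curl, an equivalent upload would look roughly like this (a sketch using the requests library; the file name and signed URL are placeholders):

import requests

with open("my-file", "rb") as f:
    response = requests.put(
        "my-signed-url",  # placeholder for the signed URL returned by your backend
        data=f,
        # must match the content_type the URL was signed with
        headers={"Content-Type": "application/octet-stream"},
    )
response.raise_for_status()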
Miguel Rueda
  • 478
  • 1
  • 6
  • 13
  • 4
    "You can not use this if you are using Application Default Credentials from Google Compute Engine or from the Google Cloud SDK." So, again, this doesn't work if you don't have a keyfile. – AKX May 18 '21 at 13:03
1

I store the credentials.json contents in Secret Manager, then load them in my Django app like this:

import json
import os

from google.cloud import secretmanager
from google.oauth2 import service_account

project_id = os.environ.get("GOOGLE_CLOUD_PROJECT")
client = secretmanager.SecretManagerServiceClient()
secret_name = "service_account_credentials"
secret_path = f"projects/{project_id}/secrets/{secret_name}/versions/latest"
credentials_json = client.access_secret_version(name=secret_path).payload.data.decode("UTF-8")
service_account_info = json.loads(credentials_json)
google_service_credentials = service_account.Credentials.from_service_account_info(
        service_account_info)
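
The answer doesn't show the signing call itself, but with key-based credentials like these it would presumably be the standard call -- a minimal sketch, with placeholder bucket and object names:

from datetime import timedelta

from google.cloud import storage

# These credentials carry a private key, so no signBlob workaround is needed.
storage_client = storage.Client(credentials=google_service_credentials)
blob = storage_client.bucket("example-bucket").blob("example-object.png")
signed_url = blob.generate_signed_url(
    version="v4",
    method="GET",
    expiration=timedelta(hours=1),
)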

I tried the answer from @guillaume-blaquiere and added the permission recommended by @guilherme-coppini, but when using Google Cloud Run I always saw the same "You need a private key to sign credentials.the credentials you are currently using..." error.

Brett Bond
  • 11
  • 2
  • Is this the right way to generate signed url : `blob.generate_signed_url( version='v4', method='GET', credentials=google_service_credentials)` – Aseem Aug 20 '23 at 17:25
0

You can't sign URLs with the default service account.

Try your service code again with a dedicated service account that has the required permissions, and see if that resolves your error.


glasnt
  • 2,865
  • 5
  • 35
  • 55
  • Thanks. This is not the issue. The service account has the correct permissions - I can get the simple code to work in a GCE VM and locally in a docker container. In Cloud Run, you get an access token vs the json private key. I can not find documentation on how to turn the access token into a JWT. – sww314 Oct 07 '20 at 01:36
  • When I say "default service account" I mean the default compute service account. – glasnt Oct 07 '20 at 05:43
  • You should be able to use `import google.auth; credentials, project_id = google.auth.default()` within a Cloud Run environment. Your workaround code specifically calls "compute_engine" which by name alone may not work in any other environment. – glasnt Oct 07 '20 at 05:44
  • 2
    @glasnt, now you can!! (I'm sure it's pretty recent!) – guillaume blaquiere Oct 07 '20 at 13:29
0

I had to add both Service Account Token Creator and Storage Object Creator to the default compute engine service account (which is what my Cloud Run services use) before it worked. You could also create a custom Role that has just the iam.serviceAccounts.signBlob permission instead of Service Account Token Creator, which is what I did.

Gillespie
  • 5,780
  • 3
  • 32
  • 54