I've been repeatedly banging my head against the proverbial brick wall of GCP's Storage API.
I'm trying to use the django-storages package to connect to a GCP bucket for my static files, and for anything else I want to use it for in the future.
According to the django-storages documentation (https://django-storages.readthedocs.io/en/latest/backends/gcloud.html#usage), if you are running inside the GCP environment, you grant your service account Storage permissions via the IAM interface and everything should be tickety-boo.
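For reference, here's a minimal sketch of the relevant settings from that page (the bucket name is a placeholder, and on GCP I'm relying on credentials being picked up automatically from the environment):

```python
# settings.py -- minimal django-storages GCS configuration (sketch).
# No key file is configured: inside GCP, Application Default Credentials
# are supposed to come from the runtime service account automatically.
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
STATICFILES_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = 'my-bucket'  # placeholder -- your actual bucket name
```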
So, my GCP Cloud Build runner builds the Docker images, then runs python manage.py migrate
and python manage.py collectstatic
before deploying my Docker image to Cloud Run. The build runner uses a service account called XXXX@cloudbuild.gserviceaccount.com
, so going into IAM, I add the "Cloud Storage – Storage Admin" role, and just to be sure, I also add the "Cloud Storage – Storage Object Admin" role.
Now I trigger a re-run of my Cloud Build and ... at the migrate stage I receive the error:
...
Step #2 - "apply migrations": File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
Step #2 - "apply migrations": return _bootstrap._gcd_import(name[level:], package, level)
Step #2 - "apply migrations": File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
Step #2 - "apply migrations": File "<frozen importlib._bootstrap>", line 991, in _find_and_load
Step #2 - "apply migrations": File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
Step #2 - "apply migrations": File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
Step #2 - "apply migrations": File "<frozen importlib._bootstrap_external>", line 843, in exec_module
Step #2 - "apply migrations": File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
Step #2 - "apply migrations": File "/src/lang/urls.py", line 20, in <module>
Step #2 - "apply migrations": re_path('favicon.ico$', RedirectView.as_view(url=staticfiles_storage.url('images/apple_touch_icon.png'), permanent=False)),
Step #2 - "apply migrations": File "/usr/local/lib/python3.8/site-packages/storages/backends/gcloud.py", line 290, in url
Step #2 - "apply migrations": return blob.generate_signed_url(
Step #2 - "apply migrations": File "/usr/local/lib/python3.8/site-packages/google/cloud/storage/blob.py", line 620, in generate_signed_url
Step #2 - "apply migrations": return helper(
Step #2 - "apply migrations": File "/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py", line 550, in generate_signed_url_v4
Step #2 - "apply migrations": ensure_signed_credentials(credentials)
Step #2 - "apply migrations": File "/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py", line 52, in ensure_signed_credentials
Step #2 - "apply migrations": raise AttributeError(
Step #2 - "apply migrations": AttributeError: you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
Finished Step #2 - "apply migrations"
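Reading the traceback, the storage backend's url() calls blob.generate_signed_url(), and signing needs a private key, which the token-only metadata-server credentials on Cloud Build don't carry. One workaround I've seen suggested (assuming the static files can be world-readable, and depending on the django-storages version) is to tell the backend to skip signed URLs entirely:

```python
# settings.py -- sketch: serve plain public URLs instead of signed ones.
# Assumes the objects (or the bucket) are publicly readable.
GS_DEFAULT_ACL = 'publicRead'    # uploaded objects get a public-read ACL
GS_QUERYSTRING_AUTH = False      # url() returns blob.public_url, no signing needed
```

I'm not sure if that's the intended fix, though, or just sidestepping the problem.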
Huh. I can't seem to authenticate via the service account.
Using code from the Google example tutorial on Django, I have the following line in my settings.py:
credentials, project_id = google.auth.default()
But I don't do anything with the credentials variable it returns. The online documentation seems a little sparse on how to access buckets via service accounts. Any insights?
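For completeness: I know I could download a JSON key for the service account and point django-storages at it, since key-file credentials do carry a private key and can sign URLs (the path below is a placeholder):

```python
# settings.py -- sketch: explicit key-file credentials, which CAN sign URLs.
# 'gcs-key.json' is a placeholder path to a downloaded service-account key.
from google.oauth2 import service_account

GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
    'gcs-key.json'
)
```

But I'd rather not ship a key file into the build if the ambient service account is supposed to just work.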