
I want to deploy a service on Google Cloud Run. It would be a Python Flask app that connects to Datastore (Firestore in Datastore mode) to either write or read a small blob.

The problem is that the docs (Accessing your Database) don't explain how to reach Datastore from within GCP when you're not on GCE or App Engine. Is there a seamless way to achieve this, or should I provide service account credentials as if it were an external platform?

Thank you in advance for your help and answers.

wescpy
R.E.B Hernandez

2 Answers


When your Cloud Run logic executes, it runs with the identity of a GCP service account, and you can choose which one at deploy time. Create and configure a service account with the roles that grant access to your Datastore; your Cloud Run logic will then have the authority to perform the desired operations. This is documented here:

Using per-service identity
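As a hedged sketch of what that configuration might look like (the service account name, project ID, and service name below are all placeholders, not anything from your setup):

```shell
# Create a dedicated service account (name is a placeholder)
gcloud iam service-accounts create my-datastore-sa

# Grant it Datastore access on the project
gcloud projects add-iam-policy-binding MY_PROJECT \
    --member "serviceAccount:my-datastore-sa@MY_PROJECT.iam.gserviceaccount.com" \
    --role "roles/datastore.user"

# Deploy (or redeploy) the Cloud Run service with that identity
gcloud run deploy SVC_NAME \
    --service-account my-datastore-sa@MY_PROJECT.iam.gserviceaccount.com
```

With this in place, the Datastore client library picks up the service's identity automatically; no key file is needed in the container.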

If for some reason you don't find this sufficient, an alternative is to retrieve the tokens necessary for access from the compute metadata server and use them explicitly within your Cloud Run logic. This is described here:

Fetching identity and access tokens
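A minimal sketch of that approach, assuming the code runs inside GCP (the metadata endpoint and required header are per the linked docs; the helper names are mine):

```python
# Sketch: fetch an access token from the metadata server available to
# Cloud Run (and other GCP compute) workloads. Only works on GCP.
import json
import urllib.request

METADATA_URL = ("http://metadata.google.internal/computeMetadata/v1/"
                "instance/service-accounts/default/token")

def build_token_request(url=METADATA_URL):
    # The metadata server rejects requests without this header.
    return urllib.request.Request(url, headers={"Metadata-Flavor": "Google"})

def fetch_access_token():
    # Returns a short-lived OAuth2 access token for the service's
    # runtime service account.
    with urllib.request.urlopen(build_token_request()) as resp:
        return json.load(resp)["access_token"]
```

In practice you rarely need this: the Google client libraries call the metadata server for you under the hood.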

Hopefully this covers the fundamentals of what you are looking for. If new questions arise after reading these pages, feel free to create more specific, detailed questions and we'll follow up there.

Kolban
  • Thank you for your answer Kolban, I was expecting that. I will go on and try it out with the service account management and configuration solution. – R.E.B Hernandez Oct 29 '19 at 09:18

To connect to Cloud Datastore from your Flask app deployed to Cloud Run...

  1. Ensure you've got both services enabled in a project with an active billing account.
  2. Ensure you've got at least both Flask & Datastore packages in your requirements.txt file (w/any desired versioning):
flask
google-cloud-datastore
  3. Integrate Datastore usage into your app... here's some sample usage in my demo main.py (Flask code dropped for simplicity):
from google.cloud import datastore

ds_client = datastore.Client()
KEY_TYPE = 'Record'

def insert(**data):
    entity = datastore.Entity(key=ds_client.key(KEY_TYPE))
    entity.update(**data)  ## where data = dict/JSON of key-value pairs
    ds_client.put(entity)

def query(limit):
    return ds_client.query(kind=KEY_TYPE).fetch(limit=limit)
  4. You can have a Dockerfile (minimal one below), but better yet, skip it and let Google (Cloud Buildpacks) build your container for you so you don't have extra stuff like this to worry about.
FROM python:3-slim
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "main.py"]
  5. Come up with an app/service name SVC_NAME then build & deploy your prototype container with gcloud beta run deploy SVC_NAME --source . --platform managed --allow-unauthenticated. (Think docker build followed by docker push and then docker run, all from 1 command!) If you have a Dockerfile, Buildpacks will use it, but if not, it'll introspect your code and dependencies to build the most efficient container it can.

That's it. Some of you will get distracted by service accounts and public/private key pairs, both of which are fine. However, to keep things simple, especially during prototyping, just use the default service account you get for free on Cloud Run. The snippet above works without any service account or IAM code present.

BTW, the above is for a prototype to get you going. If you were deploying to production, you wouldn't use the Flask dev server. You'd probably add gunicorn to your requirements.txt and Dockerfile, and you'd probably create a dedicated service account with specific IAM permissions, perhaps adding other requirements like IAP, VPC, and/or a load balancer.
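For instance, a production Dockerfile might swap the dev server for gunicorn along these lines (a sketch, assuming the Flask object is named app in main.py and that gunicorn has been added to requirements.txt):

```dockerfile
FROM python:3-slim
WORKDIR /app
COPY . .
# requirements.txt now also lists gunicorn
RUN pip install -r requirements.txt
# Bind to the port Cloud Run injects via $PORT instead of
# launching the Flask development server
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 main:app
```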

wescpy