
I have a Google Cloud Function in Python that deploys new Google Cloud Functions (from a .zip file in my Cloud Storage bucket) inside the same project. The function contains the following code:

import requests
import json


def make_func(request):


    # Get the access token from the metadata server
    metadata_server_token_url = 'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token?scopes=https://www.googleapis.com/auth/cloud-platform'
    token_request_headers = {'Metadata-Flavor': 'Google'}
    token_response = requests.get(metadata_server_token_url, headers=token_request_headers)
    token_response_decoded = token_response.content.decode("utf-8")
    jwt = json.loads(token_response_decoded)['access_token']

    # Call the Cloud Functions v1 API to create the function
    response = requests.post(
        'https://cloudfunctions.googleapis.com/v1/projects/my-project/locations/us-central1/functions',
        json={"name": "projects/my-project/locations/us-central1/functions/funct",
              "runtime": "python37",
              "sourceArchiveUrl": "gs://functions/main.zip",
              "entryPoint": "hello_world",
              "httpsTrigger": {}},
        headers={'Accept': 'application/json',
                 'Content-Type': 'application/json',
                 'Authorization': 'Bearer {}'.format(jwt)})
    if response:  # truthy for any 2xx status code
        return 'Success! Function Created'
    else:
        return str(response.json())
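As a side note, the request body above can be factored into a small helper, which also makes it easy to set the `maxInstances` field of the v1 `CloudFunction` resource. Capping instances is the closest built-in per-function limit the API offers. This is a sketch; `build_function_body` is a hypothetical helper name, not part of any Google library:

```python
def build_function_body(project, region, name, source_zip, entry_point,
                        runtime="python37", max_instances=None):
    """Build the JSON body for a Cloud Functions v1 create call."""
    body = {
        "name": "projects/{}/locations/{}/functions/{}".format(project, region, name),
        "runtime": runtime,
        "sourceArchiveUrl": source_zip,
        "entryPoint": entry_point,
        "httpsTrigger": {},
    }
    if max_instances is not None:
        # maxInstances caps how many instances may run concurrently,
        # which indirectly bounds throughput (and therefore spend).
        body["maxInstances"] = max_instances
    return body
```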

I want to add some kind of quota/limit to these new functions. This could be the number of invocations, invocations per minute, spend on the function, etc. Do you have any idea how I can add a quota, and how I can add this to my Python script?

Thanks!

Siem Peters
    If you want to implement this, it's going to be a lot more than just a couple lines to paste into your code. Since Cloud Functions invocations are stateless, you will need to implement some sort of persistent counter using another Cloud product, to check on every invocation. You also have no way of limiting total spend, other than the generalized GCP per-project billing alerts that are triggered by total spend by all products in the entire project. – Doug Stevenson Mar 15 '20 at 20:36
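The persistent-counter idea from the comment above can be sketched as a sliding-window limiter. The class below keeps the window in memory for illustration only; since Cloud Functions instances are stateless, a real deployment would have to back this state with something like Firestore or Memorystore and check it on every invocation. `SlidingWindowLimiter` is a hypothetical name, not an existing API:

```python
import time
from collections import deque


class SlidingWindowLimiter:
    """Allow at most `limit` calls per `window` seconds (in-memory sketch)."""

    def __init__(self, limit, window=60.0):
        self.limit = limit
        self.window = window
        self.calls = deque()  # timestamps of recent allowed calls

    def allow(self, now=None):
        """Return True if a call may proceed now, recording it if so."""
        now = time.monotonic() if now is None else now
        # Evict timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False
```

Each invocation would call `allow()` at the top of the handler and return an HTTP 429 when it yields `False`; with a shared datastore the eviction and append would need to run inside a transaction to stay correct under concurrency.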

2 Answers


I wrote an article on doing this with Cloud Endpoints.

Something more managed is coming, maybe an announcement at Google Cloud Next in 3 weeks. Stay tuned!
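For reference, Cloud Endpoints enforces quotas declaratively in the OpenAPI spec via the `x-google-management` and `x-google-quota` extensions. A minimal sketch, where the metric and limit names (`function-calls`, `function-calls-limit`) and the `/funct` path are placeholders for your own service:

```yaml
# openapi.yaml (fragment): define a metric and a per-project rate limit
x-google-management:
  metrics:
    - name: "function-calls"
      displayName: "Function calls"
      valueType: INT64
      metricKind: DELTA
  quota:
    limits:
      - name: "function-calls-limit"
        metric: "function-calls"
        unit: "1/min/{project}"
        values:
          STANDARD: 100   # 100 calls per minute per consumer project

paths:
  /funct:
    get:
      operationId: "funct"
      x-google-quota:
        metricCosts:
          function-calls: 1   # each request costs 1 unit of the metric
      responses:
        '200':
          description: OK
```

Requests over the limit are rejected by the Endpoints proxy before they ever reach (and bill) the function.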

guillaume blaquiere

I was not able to solve this for Google Cloud Functions. However, I found a solution for Firebase Functions. You can find it in the question over here: Rate limiting for Google/Firebase cloud functions?

Siem Peters