
I am using Firebase Functions and I have a pubsub function that runs every night. This function is responsible for doing some processing on every user's account (such as preparing some data for them for the next day and sending them an email).
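Simplified, the nightly function currently looks something like this (the `users` collection and the per-user helper are placeholders for my real logic):

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Runs once a night and processes every user account in a single invocation.
export const nightlyUserJob = functions.pubsub
  .schedule("every day 02:00")
  .onRun(async () => {
    const usersSnap = await admin.firestore().collection("users").get();
    for (const userDoc of usersSnap.docs) {
      // Placeholder: prepare tomorrow's data and send the email.
      await prepareDataAndSendEmail(userDoc);
    }
  });

// Stand-in for the real per-user processing.
async function prepareDataAndSendEmail(
  userDoc: admin.firestore.QueryDocumentSnapshot
): Promise<void> {
  // ...
}
```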

The issue I'm having is that there is a time limit on how long a function can take to run, which, for event-driven functions, is 10 minutes: https://firebase.google.com/docs/functions/quotas#:~:text=Time%20Limits,-Quota&text=60%20minutes%20for%20HTTP%20functions,minutes%20for%20event%2Ddriven%20functions.

Now that my number of users has scaled significantly, this limit is no longer sufficient to complete the work for all users.

To help with this, I created a firestore.onWrite event function that fires when a certain document is written. Now, when my pubsub function runs, it writes to a document in Firestore, which in turn triggers my onWrite function. This has let me process more users than before. However, 1) this approach just feels wrong, and 2) it's still not sufficient: I've also run into rate limits on document writes, and I simply can't fit all of my users into the allowed time limit.
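Roughly, the chaining looks like this (heavily simplified; the collection names and chunk size are just illustrative):

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Nightly: instead of doing the work directly, write one "job" document
// per chunk of user IDs.
export const nightlyFanOut = functions.pubsub
  .schedule("every day 02:00")
  .onRun(async () => {
    const usersSnap = await admin.firestore().collection("users").get();
    const userIds = usersSnap.docs.map((d) => d.id);
    const CHUNK_SIZE = 200; // illustrative

    for (let i = 0; i < userIds.length; i += CHUNK_SIZE) {
      await admin.firestore().collection("nightlyJobs").add({
        userIds: userIds.slice(i, i + CHUNK_SIZE),
        createdAt: admin.firestore.FieldValue.serverTimestamp(),
      });
    }
  });

// Each written job document triggers a separate invocation, so every chunk
// gets its own 10-minute window.
export const processJobChunk = functions.firestore
  .document("nightlyJobs/{jobId}")
  .onWrite(async (change) => {
    if (!change.after.exists) {
      return; // ignore deletes
    }
    const { userIds } = change.after.data() as { userIds: string[] };
    for (const uid of userIds) {
      // Per-user processing for `uid` goes here.
    }
  });
```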

What is the correct approach for doing nightly batches (on user accounts), preferably within the Firebase ecosystem?

Eric

1 Answer


As you correctly pointed out, and as mentioned in the documentation, the maximum amount of time a function can run before being forcibly terminated is 10 minutes for an event-driven function.

However, to work around that limitation, I found an SO post from someone who had a similar issue to yours. Most of the recommendations there come down to batching the writes in groups of fewer than 500 operations and then committing each batch.

Have a look at the solution given there.
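A minimal sketch of that batching idea, assuming the nightly results end up as Firestore document writes (the helper name and the data shape are just placeholders):

```ts
import * as admin from "firebase-admin";

admin.initializeApp();

// Commit Firestore writes in chunks, staying under the 500-operation-per-batch
// limit mentioned in that post.
async function writeInBatches(
  updates: { ref: admin.firestore.DocumentReference; data: admin.firestore.DocumentData }[]
): Promise<void> {
  const db = admin.firestore();
  const BATCH_LIMIT = 450; // comfortably under 500 operations per batch

  for (let i = 0; i < updates.length; i += BATCH_LIMIT) {
    const batch = db.batch();
    for (const { ref, data } of updates.slice(i, i + BATCH_LIMIT)) {
      batch.set(ref, data, { merge: true });
    }
    await batch.commit(); // one round trip per chunk
  }
}
```

Each chunk commits atomically and independently, so the commits can also be spread across multiple function invocations if a single one can't finish in time.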

Hope this helps.

Mousumi Roy