183

We're working on an application that uses the new Firebase Cloud Functions. What currently happens is that a transaction is put into a queue node, and the function then removes that node and writes it to the correct node. We implemented it this way so the app can keep working offline.

Our current problem is the speed of the function. The function itself takes about 400 ms, so that's alright. But sometimes the function takes a very long time (around 8 seconds) to run, even though the entry was already added to the queue.

We suspect that the server takes time to boot up, because when we perform the action again right after the first one, it takes far less time.

Is there any way to fix this problem? Below is the code of our function. We suspect there's nothing wrong with it, but we added it just in case.

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase); // initialise the Admin SDK before calling admin.database()
const database = admin.database();

exports.insertTransaction = functions.database
    .ref('/userPlacePromotionTransactionsQueue/{userKey}/{placeKey}/{promotionKey}/{transactionKey}')
    .onWrite(event => {
        if (event.data.val() == null) return null;

        // get keys
        const userKey = event.params.userKey;
        const placeKey = event.params.placeKey;
        const promotionKey = event.params.promotionKey;
        const transactionKey = event.params.transactionKey;

        // init update object
        const data = {};

        // get the transaction
        const transaction = event.data.val();

        // transfer transaction
        saveTransaction(data, transaction, userKey, placeKey, promotionKey, transactionKey);
        // remove from queue
        data[`/userPlacePromotionTransactionsQueue/${userKey}/${placeKey}/${promotionKey}/${transactionKey}`] = null;

        // fetch promotion
        database.ref(`promotions/${promotionKey}`).once('value', (snapshot) => {
            // Check if the promotion exists.
            if (!snapshot.exists()) {
                return null;
            }

            const promotion = snapshot.val();

            // fetch the current stamp count
            database.ref(`userPromotionStampCount/${userKey}/${promotionKey}`).once('value', (snapshot) => {
                let currentStampCount = 0;
                if (snapshot.exists()) currentStampCount = parseInt(snapshot.val());

                data[`userPromotionStampCount/${userKey}/${promotionKey}`] = currentStampCount + transaction.amount;

                // determines if there are new full cards
                const currentFullcards = Math.floor(currentStampCount > 0 ? currentStampCount / promotion.stamps : 0);
                const newStamps = currentStampCount + transaction.amount;
                const newFullcards = Math.floor(newStamps / promotion.stamps);

                if (newFullcards > currentFullcards) {
                    for (let i = 0; i < (newFullcards - currentFullcards); i++) {
                        const cardTransaction = {
                            action: "pending",
                            promotion_id: promotionKey,
                            user_id: userKey,
                            amount: 0,
                            type: "stamp",
                            date: transaction.date,
                            is_reversed: false
                        };

                        saveTransaction(data, cardTransaction, userKey, placeKey, promotionKey);

                        const completedPromotion = {
                            promotion_id: promotionKey,
                            user_id: userKey,
                            has_used: false,
                            date: admin.database.ServerValue.TIMESTAMP
                        };

                        const promotionPushKey = database
                            .ref()
                            .child(`userPlaceCompletedPromotions/${userKey}/${placeKey}`)
                            .push()
                            .key;

                        data[`userPlaceCompletedPromotions/${userKey}/${placeKey}/${promotionPushKey}`] = completedPromotion;
                        data[`userCompletedPromotions/${userKey}/${promotionPushKey}`] = completedPromotion;
                    }
                }

                return database.ref().update(data);
            }, (error) => {
                // Log to the console if an error happened.
                console.log('The read failed: ' + error.code);
                return null;
            });

        }, (error) => {
            // Log to the console if an error happened.
            console.log('The read failed: ' + error.code);
            return null;
        });
    });

function saveTransaction(data, transaction, userKey, placeKey, promotionKey, transactionKey) {
    if (!transactionKey) {
        transactionKey = database.ref('transactions').push().key;
    }

    data[`transactions/${transactionKey}`] = transaction;
    data[`placeTransactions/${placeKey}/${transactionKey}`] = transaction;
    data[`userPlacePromotionTransactions/${userKey}/${placeKey}/${promotionKey}/${transactionKey}`] = transaction;
}
Frank van Puffelen
Stan van Heumen

7 Answers

158

firebaser here

It sounds like you're experiencing a so-called cold start of the function.

When your function hasn't been executed in some time, Cloud Functions puts it into a mode that uses fewer resources, so that you don't pay for compute time you're not using. When you then hit the function again, it restores the environment from this mode. The time it takes to restore consists of a fixed cost (e.g. restoring the container) and a variable cost (e.g. if you use a lot of Node modules, it may take longer).

We're continually monitoring the performance of these operations to ensure the best mix between developer experience and resource usage. So expect these times to improve over time.

The good news is that you should only experience this during development. Once your functions are being frequently triggered in production, chances are they'll hardly ever hit a cold start again, especially if they have consistent traffic. If some functions tend to see spikes of traffic, however, you'll still see cold starts for every spike. In that case, you may want to consider the minInstances setting to keep a set number of instances of a latency-critical function warm at all times.
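If you want to reduce the impact of cold starts on a latency-critical function today, here is a minimal sketch of the minInstances setting mentioned above. It assumes a 1st-gen HTTPS function and a firebase-functions SDK version that supports runWith({ minInstances }); the function name and response are placeholders, not from the question.

const functions = require('firebase-functions');

// Keep at least one instance of this latency-critical function warm at all times.
// Note: you are billed for the idle instance while it sits warm.
exports.latencyCriticalFunction = functions
    .runWith({ minInstances: 1 })
    .https.onRequest((req, res) => {
        res.status(200).send('warm response');
    });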

Jeff
Frank van Puffelen
69

Update March 2021: It may be worth checking out the answer below from @George43g, which offers a neat solution for automating the process below. Note - I haven't tried this myself and so cannot vouch for it, but it seems to automate the process described here. You can read more at https://github.com/gramstr/better-firebase-functions - otherwise read on for how to implement it yourself and understand what is happening inside functions.

Update May 2020: Thanks to the comment from maganap - in Node 10+, FUNCTION_NAME is replaced with K_SERVICE (FUNCTION_TARGET is the function itself, not its name, replacing ENTRY_POINT). The code samples below have been updated accordingly.

More info at https://cloud.google.com/functions/docs/migrating/nodejs-runtimes#nodejs-10-changes

Update - looks like a lot of these problems can be solved using the hidden variable process.env.FUNCTION_NAME as seen here: https://github.com/firebase/functions-samples/issues/170#issuecomment-323375462

Update with code - For example, if you have the following index file:

...
exports.doSomeThing = require('./doSomeThing');
exports.doSomeThingElse = require('./doSomeThingElse');
exports.doOtherStuff = require('./doOtherStuff');
// and more.......

Then all of your files will be loaded, and all of those files' requirements will also be loaded, resulting in a lot of overhead and polluting your global scope for all of your functions.

Instead, separate your includes out like this:

const function_name = process.env.FUNCTION_NAME || process.env.K_SERVICE;
if (!function_name || function_name === 'doSomeThing') {
  exports.doSomeThing = require('./doSomeThing');
}
if (!function_name || function_name === 'doSomeThingElse') {
  exports.doSomeThingElse = require('./doSomeThingElse');
}
if (!function_name || function_name === 'doOtherStuff') {
  exports.doOtherStuff = require('./doOtherStuff');
}

This will only load the required file(s) when that function is specifically called, allowing you to keep your global scope much cleaner, which should result in faster cold boots.


This should allow for a much neater solution than what I've done below (though the explanation below still holds).


Original Answer

It looks like requiring files and general initialisation happening in the global scope is a huge cause of slow-down during cold-boot.

As a project gets more functions, the global scope is polluted more and more, making the problem worse - especially if you scope your functions into separate files (such as by using Object.assign(exports, require('./more-functions.js')); in your index.js).

I've managed to see huge gains in cold-boot performance by moving all my requires into an init method as below and then calling it as the first line inside any function definition for that file. Eg:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
// Late initialisers for performance
let initialised = false;
let handlebars;
let fs;
let path;
let encrypt;

function init() {
  if (initialised) { return; }

  handlebars = require('handlebars');
  fs = require('fs');
  path = require('path');
  ({ encrypt } = require('../common'));
  // Maybe do some handlebars compilation here too

  initialised = true;
}
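As a hedged illustration of "calling it as the first line inside any function definition", the sketch below shows one way a function in the same file might use init(); the function name, trigger and template handling are made up for this example, not taken from the original project.

// Hypothetical usage: call init() before doing any real work.
// Assumes admin.initializeApp() has been called elsewhere in this file.
exports.sendWelcomeEmail = functions.auth.user().onCreate((user) => {
  init(); // heavy modules are only loaded when this function actually runs
  const source = fs.readFileSync(path.join(__dirname, 'templates', 'welcome.hbs'), 'utf8');
  const template = handlebars.compile(source);
  return admin.database()
    .ref(`emailQueue/${user.uid}`)
    .set({ body: template({ email: user.email }) });
});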

I've seen improvements from about 7-8s down to 2-3s when applying this technique to a project with ~30 functions across 8 files. This also seems to cause functions to need to be cold-booted less often (presumably due to lower memory usage?)

Unfortunately, even with this, HTTP functions are still barely usable for user-facing production use.

Hoping the Firebase team have some plans in future to allow for proper scoping of functions so that only the relevant modules ever need to be loaded for each function.

Tyris
  • Hey Tyris, I'm facing the same issue with operation time and I'm trying to implement your solution. Just trying to understand: who calls the init function, and when? – Manspof Apr 22 '18 at 07:04
  • Hi @AdirZoari, my explanation of using init() and so forth is probably not best practice; its value is just to demonstrate my findings about the core problem. You'd be much better off looking at the hidden variable `process.env.FUNCTION_NAME` and using that to conditionally include the files required for that function. The comment at https://github.com/firebase/functions-samples/issues/170#issuecomment-323375462 gives a really good description of this working! It ensures that the global scope isn't polluted with methods and includes from irrelevant functions. – Tyris Apr 23 '18 at 07:28
  • This sounds promising! I do notice an immediate decrease in response times (good!). I am wondering, do you think this also prevents duplicate functions running in parallel? My case is a Dialogflow app for Google Assistant. I had a hunch that my user session would 'break' if many of my other functions would be called (potentially due to a change of instance (or session ID)). Looking forward to hearing your thoughts! – davidverweij Mar 08 '19 at 11:29
  • Hi @davidverweij, I don't think this will help in terms of the possibility of your functions running twice or in parallel. Functions auto scale as needed so multiple functions (the same function, or different ones) can be running in parallel at any time. This means you do have to consider data safety and consider using transactions. Also, check out this article on your functions possibly running twice: https://cloud.google.com/blog/products/serverless/cloud-functions-pro-tips-retries-and-idempotency-in-action – Tyris Mar 16 '19 at 01:51
  • Notice `FUNCTION_NAME` is only valid with Node 6 and 8, as explained here: https://cloud.google.com/functions/docs/env-var#environment_variables_set_automatically . Node 10 should use `FUNCTION_TARGET` – maganap Apr 29 '20 at 13:42
  • Thanks for the update @maganap, looks like it should use `K_SERVICE` according to the doco at https://cloud.google.com/functions/docs/migrating/nodejs-runtimes#nodejs-10-changes - I've updated my answer. – Tyris May 01 '20 at 03:28
  • Thanks for updating with K_SERVICE +1 – Ayyappa Jul 31 '20 at 11:14
  • That looks neat, but wouldn't it lead to a 404 if someone calls 'doSomeThing' and then 'doSomeThingElse' right after the first call finishes? I guess it won't be a problem if each call triggers the creation of a new server instance, since it will init everything from scratch for each call, but Firebase can reuse server instances to minimize cold starts. And if the server instance was inited with 'doSomeThing', then the rest of the functions won't be included, and calling the same server instance with 'doSomeThingElse' will fail with a 404. Or not? What am I missing here if that's not the case? – vir us Mar 08 '21 at 12:17
  • @virus - functions spin up a unique instance for each registered function. That's why this works, otherwise you'd have the issues you describe. This is also why you can deploy each function independently, and why each function listed in the Firebase Console can have different node versions assigned to run them. Ironically, if all functions ran in a single server instance, this cold-start issue would be significantly less prevalent since the server instance(s) would spin down less often. – Tyris Mar 09 '21 at 23:25
9

I am facing similar issues with Firestore cloud functions. The biggest one is performance, especially in the case of early-stage startups, when you can't afford to have your early customers see "sluggish" apps. A simple documentation-generation function, for example, gives this:

-- Function execution took 9522 ms, finished with status code: 200

Then: I had a straightforward terms-and-conditions page. With cloud functions, execution would at times take 10-15 seconds due to the cold start. I then moved it to a Node.js app hosted in an App Engine container, and the time has come down to 2-3 seconds.

I have been comparing many of the features of MongoDB with Firestore, and sometimes I too wonder whether, during this early phase of my product, I should move to a different database. The biggest advantage I had with Firestore was the trigger functionality (onCreate, onUpdate) on document objects.

https://db-engines.com/en/system/Google+Cloud+Firestore%3BMongoDB

Basically, if there are static portions of your site that can be offloaded to the App Engine environment, it's perhaps not a bad idea.

Sudhakar R
  • I don't think Firebase Functions are fit for purpose as far as serving dynamic user-facing content goes. We use a few HTTP functions sparingly for things like password reset, but in general if you have dynamic content, serve it elsewhere as an express app (or use a different language). – Tyris Mar 16 '19 at 01:53
5

UPDATE 2022 - the lib is maintained again. Firebase now has the ability to keep instances warm; however, there are still potential performance and code-structure benefits.

UPDATE/EDIT: new syntax and updates coming MAY2020

I just published a package called better-firebase-functions. It automatically searches your function directory and correctly nests all the found functions in your exports object, while isolating the functions from each other to improve cold-boot performance.

If you lazy-load and cache only the dependencies you need for each function within the module scope, you'll find it's the simplest and easiest way to keep your functions optimally efficient over a fast-growing project.

import { exportFunctions } from 'better-firebase-functions'
exportFunctions({__filename, exports})
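To illustrate the "lazy-load and cache only the dependencies you need" advice above, here is a minimal sketch of one function module; the handlebars dependency and the greet function name are placeholders, and the exact export shape better-firebase-functions expects is not shown here.

const functions = require('firebase-functions');

let handlebars; // cached in module scope across warm invocations of this instance

exports.greet = functions.https.onRequest((req, res) => {
  if (!handlebars) {
    handlebars = require('handlebars'); // only loaded the first time this function runs
  }
  const template = handlebars.compile('Hello {{name}}!');
  res.status(200).send(template({ name: req.query.name || 'world' }));
});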
George43g
3

I have done these things as well, which improves performance once the functions are warmed up, but the cold start is killing me. One of the other issues I've encountered is with CORS, because it takes two trips to the cloud function to get the job done. I'm sure I can fix that, though.

When you have an app in its early (demo) phase, when it is not used frequently, the performance is not going to be great. This is something that should be considered, as early adopters with an early product need to look their best in front of potential customers/investors. We loved the technology, so we migrated from older tried-and-true frameworks, but our app seems pretty sluggish at this point. I'm going to try some warm-up strategies next to make it look better.
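One warm-up strategy, mentioned in the comments below, is a scheduled job that periodically pings each function. Here is a minimal sketch using the scheduled functions API; the schedule and the pinged URL are placeholders, and you still pay for these keep-warm invocations.

const functions = require('firebase-functions');
const https = require('https');

// Ping an HTTPS function every 5 minutes so its instance is less likely to be scaled down.
exports.keepWarm = functions.pubsub.schedule('every 5 minutes').onRun(() => {
  return new Promise((resolve, reject) => {
    https.get('https://us-central1-<your-project-id>.cloudfunctions.net/yourHttpFunction', (res) => {
      res.resume(); // drain the response; we only care that the function was invoked
      resolve(null);
    }).on('error', reject);
  });
});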

  • We are testing a cron-job to wake every single function up. Maybe this approach helps you too. – Jesús Fuentes Jul 11 '18 at 10:30
  • Hey @JesúsFuentes, I was just wondering whether waking the functions up worked for you. Sounds like a crazy solution :D – AlexZvl Jul 26 '18 at 20:55
  • Hi @Alexandr, sadly we didn't have time to do it yet, but it's on our top-priority list. It should work in theory, though. The problem comes with onCall functions, which need to be launched from a Firebase app. Maybe calling them from the client every X minutes? We'll see. – Jesús Fuentes Jul 27 '18 at 08:22
  • I have a similar situation: it takes 3 seconds to run the function, which is too much. I removed all dependencies. I tried allocating 512 MB, and it's just a bit better, but still unacceptable. Now I'm thinking of minimizing the payload; I hope it will make it a bit faster. @JesúsFuentes – AlexZvl Jul 27 '18 at 08:32
  • @Alexandr shall we have a conversation outside Stack Overflow? We might help each other with new approaches. – Jesús Fuentes Jul 27 '18 at 08:44
  • @Alexandr we didn't test this 'wakeup' workaround yet, but we already deployed our functions to europe-west1. Still unacceptable times. – Jesús Fuentes Aug 07 '18 at 15:49
  • @JesúsFuentes, how can I reach you? Give me your email please. – AlexZvl Aug 08 '18 at 08:03
  • @Alexandr we can try https://chat.stackexchange.com/rooms/118/root-access (let's remove all these msgs once we get in touch). – Jesús Fuentes Aug 08 '18 at 09:25
  • It did help us. It took a bit to instrument it, making sure the app was not still pinging the cloud functions when it was in the background, but overall, it worked around the cold-start latency. – Stan Swiniarski Sep 13 '18 at 00:05
2

I experienced very poor performance with my first project on Firebase Functions, where a simple function would take minutes to execute (knowing the 60-second limit for function execution, I knew there was something wrong with my functions). The issue in my case was that I didn't properly terminate the function.

In case someone experiences the same issue, make sure to terminate the function by either:

  1. Sending a response for HTTP triggers
  2. Returning a promise for background triggers

Here's the YouTube link from Firebase that helped me solve the issue.
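As a hedged sketch of the two termination patterns listed above (the function names and database paths are placeholders, not from the answer):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// 1. HTTP trigger: terminate by sending a response.
exports.httpExample = functions.https.onRequest((req, res) => {
  res.status(200).send('done');
});

// 2. Background trigger: terminate by returning a promise for the pending work.
exports.backgroundExample = functions.database
  .ref('/messages/{messageId}')
  .onCreate((snapshot) => {
    return snapshot.ref.parent.child('lastMessageId').set(snapshot.key);
  });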

Fransiska
0

Cloud Functions have inconsistent cold-start times when used with the Firestore libraries, due to the gRPC libraries used within them.

We recently made a fully compatible REST client (@bountyrush/firestore), which is intended to be updated in parallel with the official nodejs-firestore client.

Fortunately, the cold starts are much better now, and we even dropped the Redis memory store we were using as a cache earlier.

Steps to integrate (see the sketch after this list):

1. npm install @bountyrush/firestore
2. Replace require('@google-cloud/firestore') with require('@bountyrush/firestore')
3. Set FIRESTORE_USE_REST_API = 'true' in your environment variables. (process.env.FIRESTORE_USE_REST_API should be set to 'true' to use REST mode; if it's not set, it's just standard Firestore with gRPC connections.)
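A minimal sketch of those steps, assuming @bountyrush/firestore mirrors the @google-cloud/firestore API as described above (the collection and document names are placeholders):

// Step 3: opt in to REST mode (can also be set in the function's environment config).
process.env.FIRESTORE_USE_REST_API = 'true';

// Step 2: require the drop-in replacement instead of '@google-cloud/firestore'.
const { Firestore } = require('@bountyrush/firestore');

const firestore = new Firestore();

firestore.collection('users').doc('alice').get()
  .then((doc) => console.log(doc.exists ? doc.data() : 'not found'))
  .catch((err) => console.error(err));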
Ayyappa