15

Since this morning, our Firebase application has had a problem writing data to the Realtime Database instance. Even the simplest task, such as adding one key-value pair to an object, triggers

Error: TRIGGER_PAYLOAD_TOO_LARGE: This request would cause a function payload exceeding the maximum size allowed.

It is especially strange since nothing in our code or database has changed for more than 24 hours.

Even something as simple as

Database.ref('environments/' + envkey).child('orders/' + orderkey).ref.set({a:1})

triggers the error.

Apparently, the size of the payload is not the problem, but what could be causing this?

Database structure, as requested:

environments
  +- env1
  +- env2
     +- orders
        +- 223344
           - customer: "Peters"
           - country: "NL"
           +- items
              +- item1
                 - code: "a"
                 - value: "b"
              +- item2
                 - code: "x"
                 - value: "2"

Niek Oost
  • 628
  • 5
  • 15
  • Could you edit your question to include what your database structure looks like in general, and what your database trigger looks like? – Doug Stevenson May 24 '18 at 15:45
  • @DougStevenson I have added an example structure. This is not a database trigger, some Angular triggers a function that uses a ref to get to a location where data is being updated – Niek Oost May 24 '18 at 15:55
  • Furthermore, we use firebase/5.0.3 and angularfire v2.3.0 – Niek Oost May 24 '18 at 16:09
  • We just started getting this error response as well – Dennis Smolek May 25 '18 at 05:18
  • Ok this is really wack.. Running through the logs to see if the depth was the issue and I'm only 10 levels deep. I started moving up and I can write higher up. I then just tried writing AT ALL at the level and it's blocking now. I cant delete data nor write to it. path: /collections/data/id/items/abc/writeMe tried writing: true and firebase rejected the write – Dennis Smolek May 25 '18 at 05:57

2 Answers

12

OK, I figured this out. The issue is not related to your write itself, but to one of the Cloud Functions that the write would trigger.

For example, we have a structure like /collections/data/abc/items/a, in JSON:

"collections": {
    "data": {
        "abc": {
            "name": "example Col",
            "itemCount": 5,
            "items": {
                "a": {"name": "a"},
                "b": {"name": "b"},
                "c": {"name": "c"},
                "d": {"name": "d"},
                "e": {"name": "e"}
            }
        }
    }
}

Any write into an item was failing, no matter how we tried: API, JavaScript, even a basic write in the console.

I decided to look at our cloud functions and found this:

const countItems = (collectionId) => {
  return firebaseAdmin.database().ref(`/collections/data/${collectionId}/items`).once('value')
    .then(snapshot => {
      const items = snapshot.val();
      const filtered = Object.keys(items).filter(key => {
        const item = items[key];
        return (item && !item.trash);
      });

      return firebaseAdmin.database().ref(`/collections/meta/${collectionId}/itemsCount`)
        .set(filtered.length);
    });
};

export const onCollectionItemAdd = functions.database.ref('/collections/data/{collectionId}/items/{itemId}')
  .onCreate((change, context) => {
    const { collectionId } = context.params;
    return countItems(collectionId);
  });

On its own it's nothing, but that function reads ALL items, and by default Firebase sends the entire snapshot at the trigger path to the Cloud Function even if we don't use it. In fact it sends the previous and new values too, so if you (like us) have a TON of items at that point, my guess is the payload it tries to send to the Cloud Function is way too big.

I removed the count function from our Cloud Functions and boom, back to normal. Not sure of the "correct" way to do the count if we can't have the trigger at all, but I'll update this if we find one...
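One possible alternative (a sketch only; the export names and paths below are ours, not from this answer): keep itemsCount incrementally with a transaction, so no function ever has to read the whole /items subtree. The transaction's update function is pure, so it can be pulled out on its own:

```javascript
// Pure update function for the counter transaction: treat a missing
// counter as 0, then add the delta (+1 on create, -1 on delete).
const applyDelta = (current, delta) => (current || 0) + delta;

// Sketch of how it would be wired up in Cloud Functions (commented out
// here because it needs the firebase-functions / firebase-admin SDKs):
//
// exports.onItemAdd = functions.database
//   .ref('/collections/data/{collectionId}/items/{itemId}')
//   .onCreate((snap, ctx) =>
//     firebaseAdmin.database()
//       .ref(`/collections/meta/${ctx.params.collectionId}/itemsCount`)
//       .transaction(current => applyDelta(current, 1)));
//
// exports.onItemRemove = functions.database
//   .ref('/collections/data/{collectionId}/items/{itemId}')
//   .onDelete((snap, ctx) =>
//     firebaseAdmin.database()
//       .ref(`/collections/meta/${ctx.params.collectionId}/itemsCount`)
//       .transaction(current => applyDelta(current, -1)));
```

This keeps each trigger payload down to a single item rather than the whole subtree, at the cost of the counter drifting if items are ever written outside these triggers.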

Dennis Smolek
  • 8,480
  • 7
  • 30
  • 39
  • Thank you very much, indeed it was a cloud function that triggered on a node that was too close to the root of the database. – Niek Oost May 25 '18 at 09:17
  • 1
RE: counting - We have used REST requests with "shallow" to pull this off (even from cloud functions), then you only need to listen for *adds* and *removes* – Mike May 25 '18 at 16:20
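To illustrate the shallow approach from the comment above (the response shape is per the RTDB REST API; the path is the one from this answer): a GET with ?shallow=true returns only the top-level keys, each mapped to true, so counting never downloads the item bodies.

```javascript
// A shallow GET such as
//   GET https://<namespace>.firebaseio.com/collections/data/abc/items.json?shallow=true
// returns only key names mapped to true; for the structure above that is:
const shallowResponse = { a: true, b: true, c: true, d: true, e: true };

// Counting is then just a key count on a tiny payload:
const itemCount = Object.keys(shallowResponse).length;
```

No trigger ever sees the full subtree this way; the count is recomputed on demand instead of maintained by a function.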
4

The TRIGGER_PAYLOAD_TOO_LARGE error is part of a new feature Firebase is rolling out, where our existing RTDB limits are being strictly enforced. The reason for the change is to make sure that we aren't silently dropping any Cloud Functions triggers, since any event exceeding those limits can't be sent to Functions.

You can turn this feature off yourself by making this REST call:

curl -X PUT -d "false" "https://<namespace>.firebaseio.com/.settings/strictTriggerValidation/.json?auth=<SECRET>"

Where <SECRET> is your database secret.

Note that if you disable this, the requests that are currently failing may go through, but any Cloud Functions that trigger on requests exceeding our limits will fail to run. If you are using database triggers for your functions, I would recommend you restructure your requests so that they stay within the limits.

Kiana
  • 1,415
  • 11
  • 17