Firebase Realtime Database currently gives TRIGGER_PAYLOAD_TOO_LARGE error


Question

Since this morning, our Firebase application has a problem when writing data to the Realtime Database instance. Even the simplest task, such as adding one key-value pair to an object, triggers

Error: TRIGGER_PAYLOAD_TOO_LARGE: This request would cause a function payload exceeding the maximum size allowed.

It is especially strange since nothing in our code or database has changed for more than 24 hours.

Even something as simple as

Database.ref('environments/' + envkey).child('orders/' + orderkey).ref.set({a:1})

triggers the error.

Apparently, the size of the payload is not the problem, but what could be causing this?

The database structure, as requested:

environments
  +- env1
  +- env2
     +- orders
        +- 223344
           +- customer: "Peters"
           +- country: "NL"
           +- items
              +- item1
              |    +- code: "a"
              |    +- value: "b"
              +- item2
                   +- code: "x"
                   +- value: "2"

Accepted answer

Ok, I figured this out. The issue is not related to your write itself, but to one of the cloud functions the write action triggers.

For example, we have a structure like /collections/data/abcd/items/a, in JSON:

"collections": {
    "data": {
        "abc": {
            "name": "example Col",
            "itemCount": 5,
            "items": {
                "a": {"name": "a"},
                "b": {"name": "b"},
                "c": {"name": "c"},
                "d": {"name": "d"},
                "e": {"name": "e"},
            }
        }
    }
}

Any write into an item was failing, no matter how it was done: via the API, from JavaScript, even a basic write in the console.

I decided to look at our cloud functions and found this:

const countItems = (collectionId) => {
  return firebaseAdmin.database().ref(`/collections/data/${collectionId}/items`).once('value')
    .then(snapshot => {
      const items = snapshot.val();
      const filtered = Object.keys(items).filter(key => {
        const item = items[key];
        return (item && !item.trash);
      });

      return firebaseAdmin.database().ref(`/collections/meta/${collectionId}/itemsCount`)
        .set(filtered.length);
    });
};

export const onCollectionItemAdd = functions.database.ref('/collections/data/{collectionId}/items/{itemId}')
  .onCreate((snapshot, context) => {
    const { collectionId } = context.params;
    return countItems(collectionId);
  });
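The filtering rule inside `countItems` can be exercised in isolation. Below is a hypothetical stand-alone version of that filter (the helper name `countActiveItems` and the sample data are our own, not from the original project), which counts only items that are not soft-deleted via a `trash` flag:

```javascript
// Stand-alone sketch of the counting rule from countItems above:
// count the entries whose value exists and is not flagged `trash`.
function countActiveItems(items) {
  if (!items) return 0; // snapshot.val() is null when the path is empty
  return Object.keys(items).filter((key) => {
    const item = items[key];
    return Boolean(item && !item.trash);
  }).length;
}

// Example: two live items, one soft-deleted with the `trash` flag.
const sample = {
  a: { name: 'a' },
  b: { name: 'b', trash: true },
  c: { name: 'c' },
};
console.log(countActiveItems(sample)); // 2
```

Pulling the rule out this way also makes it clear that the expensive part of the original function is not the filter itself, but the `once('value')` read of every item that feeds it.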

On its own it's nothing, but that trigger reads ALL items, and by default Firebase Cloud Functions sends the entire snapshot at the triggering path to the function, even if we don't use it. In fact, for write and update triggers it sends the before and after values too, so if you (like us) have a ton of items at that point, my guess is that the payload it tries to send to the cloud function is way too big.

I removed the count functions from our cloud functions and, boom, back to normal. I'm not sure of the "correct" way to do the count if we can't have the trigger at all, but I'll update this if we find one...

