Loading Bulk data in Firebase


Problem Description



I am trying to use the set API to set an object in Firebase. The object is fairly large: the serialized JSON is 2.6 MB in size. The root node has around 90 children, and in all there are around 10,000 nodes in the JSON tree.

The set API seems to hang and does not call the callback. It also seems to cause problems with the Firebase instance.

Any ideas on how to work around this?

Solution

Since this is a commonly requested feature, I'll go ahead and merge Robert and Puf's comments into an answer for others.

There are some tools available to help with big data imports, like firebase-streaming-import. What they do internally can also be engineered fairly easily for the do-it-yourselfer:

1) Get a list of keys without downloading all the data, using a GET request and shallow=true. Possibly do this recursively depending on the data structure and dynamics of the app.

2) In some sort of throttled fashion, upload the "chunks" to Firebase using PUT requests or the API's set() method, as sketched below.
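For the do-it-yourself route, here is a minimal sketch of both steps, assuming the Firebase REST API and Python's requests library; the database URL, secret, path, and file name below are illustrative placeholders, not values from the question:

    # A minimal sketch of the two steps above, using the Firebase REST API.
    # DATABASE_URL, the auth secret, the path, and the file name are all
    # hypothetical stand-ins.
    import json
    import time

    import requests

    DATABASE_URL = "https://your-project.firebaseio.com"   # hypothetical
    AUTH = {"auth": "YOUR_DATABASE_SECRET"}                 # hypothetical

    # Step 1: list the keys at a path without downloading the data under
    # them. With shallow=true, Firebase returns {"child1": true, ...}.
    resp = requests.get(DATABASE_URL + "/some/path.json",
                        params=dict(shallow="true", **AUTH))
    resp.raise_for_status()
    existing_keys = set(resp.json() or {})

    # Step 2: upload the large object one child at a time, throttled so
    # the import stays well under ~20 writes per second.
    with open("bulk-data.json") as f:    # e.g. the 2.6 MB object
        bulk = json.load(f)

    for key, child in bulk.items():      # ~90 root children in the question
        put = requests.put(DATABASE_URL + "/some/path/" + key + ".json",
                           params=AUTH, json=child)
        put.raise_for_status()
        time.sleep(0.05)                 # caps the loop at 20 requests/sec

The sleep gives a hard ceiling of 20 requests per second; in practice, request latency alone will usually keep the loop below that.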

The critical thing to keep in mind here is that the number of bytes in a request and the frequency of requests will have an impact on performance for others viewing the application, and will also count against your bandwidth.

A good rule of thumb is that you don't want to do more than ~100 writes per second during your import, preferably lower than 20 to maximize your realtime speeds for other users, and that you should keep the data chunks in low MBs--certainly not GBs per chunk. Keep in mind that all of this has to go over the internets.
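For the object in the question, for instance, writing each of its ~90 root children as a separate request at 20 writes per second would finish in roughly five seconds, with chunks averaging around 30 KB each (2.6 MB across 90 children), comfortably inside these limits.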

