Mongoose + mLab + big amount of data
Question
I use Mongoose and React to fetch data from an mLab database. The problem is that the database is 200 MB in size, a collection of more than 400,000 objects. So when I fetch, I either get a JS out-of-memory error or the request stays pending for 2 or 3 minutes.
I don't know how to improve this request.
I also need to do some data formatting, and I don't know whether it is better done on the back end or the front end.
I need your help to find a solution. Thanks a lot.
Answer
You can try:
connection.model("myModel").find(query).limit(limit).skip(offset).exec()
where:

- `query` is the criteria of your search,
- `limit` is the number of elements you want to load,
- `offset` is the number of elements you have already loaded and want to skip.
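As a sketch of how the pagination above could be applied: the helper below converts a 1-based page number into the `skip`/`limit` values that Mongoose's `.skip()` and `.limit()` expect, and the comment shows a hypothetical call against your model (the model name `"myModel"` comes from the answer; `pageSize` and the use of `.lean()` are assumptions, not part of the original).

```javascript
// Convert a 1-based page number and a page size into the skip/limit
// values expected by Mongoose's .skip() and .limit().
function paginationParams(page, pageSize) {
  const p = Math.max(1, page); // clamp so page 0 or negatives behave like page 1
  return { skip: (p - 1) * pageSize, limit: pageSize };
}

// Hypothetical usage with the Mongoose model from the answer:
//
//   const { skip, limit } = paginationParams(3, 50);
//   const docs = await connection.model("myModel")
//     .find(query)
//     .skip(skip)
//     .limit(limit)
//     .lean()   // plain JS objects instead of full documents: less memory
//     .exec();
```

Fetching one page at a time this way keeps each response small, so neither the server nor the React client has to hold all 400,000 objects in memory at once.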