Mongoose QueryStream new results


Question

I am trying to set up MongooseJS to push out the whole collection (or just the newest item) when new documents are inserted into the collection by another application.

I assumed a QueryStream was the way to go.

However, when I start my simple application, it reads out the collection once, and closes it.

When I insert a new document, nothing happens (I assume the connection is no longer open and looking for new results...?)

// db is an open Mongoose connection; OrderSchema is defined elsewhere.
var Orders = db.model('orders', OrderSchema);

// Streams every document currently in the collection, then ends.
var stream = Orders.find().stream();

stream.on('data', function(doc){
    console.log('New item!');
    console.log(doc);
}).on('error', function (error){
    console.log(error);
}).on('close', function () {
    console.log('closed');
});

This immediately prints all the items that are currently in the orders collection, and then prints "closed". Shouldn't the stream remain open and print new data when the collection changes?

What am I not understanding about the MongooseJS QueryStream?

P.S. My goal is to eventually emit an updated collection via socket.io, as demonstrated here: Mongoose stream return few results first time

Answer

I discovered that in order for this method to work I needed to change my collection to a capped collection:

// The collection must be capped for a tailable cursor to work.
var OrderSchema = new Mongoose.Schema({...
}, { capped: { size: 10, max: 10, autoIndexId: true }});

var Orders = db.model('orders', OrderSchema);

// tailable() keeps the cursor open so new inserts are streamed as they arrive.
var stream = Orders.find().tailable().stream();

stream.on('data', function(doc){
    console.log('New item!');
    console.log(doc);
}).on('error', function (error){
    console.log(error);
}).on('close', function () {
    console.log('closed');
});
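To verify the tailable stream, a document can be inserted from a separate process while this one is running; a minimal sketch (the item and qty field names are hypothetical, since the OrderSchema fields are elided above):

// Run from another Node process (or insert via the mongo shell) while the
// stream above is open; the streaming process should then log 'New item!'.
// The field names are hypothetical, since OrderSchema is elided above.
Orders.create({ item: 'coffee', qty: 1 }, function (err, doc) {
    if (err) return console.log(err);
    console.log('inserted order', doc._id);
});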

This works because I can now treat the MongoDB collection like something of a message queue, which is continuously updated.
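If the orders collection already exists, it does not have to be dropped and recreated; MongoDB can convert it in place from the mongo shell. A minimal sketch (the size value, in bytes, is only an example):

// mongo shell: convert the existing collection to a capped collection.
db.runCommand({ convertToCapped: "orders", size: 100000 })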

Strangely enough, when I wrap this inside of a SocketIO event I get multiples of the same documents, which makes me think there is still something I'm not doing exactly right...
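One possible cause of the duplicates is opening a new tailable stream inside every connection handler, which replays the capped collection once per client. Below is a hedged sketch that shares the single stream from above across all sockets; the Socket.IO setup, port, and 'order' event name are assumptions, not part of the original code:

var io = require('socket.io')(3000); // assumed standalone Socket.IO server on port 3000

// Reuse the one tailable stream created above instead of opening a new one
// per connection; a per-connection stream would replay the capped collection
// to every client and emit the same documents more than once.
stream.on('data', function (doc) {
    io.emit('order', doc); // 'order' is a hypothetical event name
});

io.on('connection', function (socket) {
    console.log('client connected: ' + socket.id);
});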
