How to return only the first occurrence of an ID with Mongoose?


Question

I have a large collection called Messages, like so:

{
   user: 94fg844f,
   event: null,
   group: null,
   name: "Jake",
   text: "Hello world"
}, {
   user: fje93jrg4,
   event: null,
   group: null,
   name: "Bob"
   text: "Testing"
}, {
   user: fje93jrg4,
   event: null,
   group: null,
   name: "Bob"
   text: "Text here"
}, {
   user: null,
   event: d0j3n9fn3,
   group: null,
   name: "My Event"
   text: "Testing 2"
}, {
   user: null,
   event: d0j3n9fn3,
   group: null,
   name: "My Event"
   text: "Another text"
}

I need to get the first occurrence of the users, events and groups.

For example, since user fje93jrg4 occurs twice, I just want to get back the document with the text Testing, since the one with the text Text here is older than it. The same goes for the event d0j3n9fn3: it occurs twice, but I only want the first document for it back, the one with the text Testing 2.

I looked into distinct, although it seems to only support one search term, like user, instead of user, event and group.
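
(For reference, a minimal sketch of that limitation, assuming a hypothetical Mongoose model named Message: distinct() returns only the unique values of a single field, not the first full document per user, event and group.)

const mongoose = require('mongoose');
const Message = mongoose.model('Message'); // hypothetical, assumes the model is already registered

// returns e.g. [ "94fg844f", "fje93jrg4", null ] -- just the values, no documents
Message.distinct('user').then(users => console.log(users));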

The end result of the above would be:

{
   user: 94fg844f,
   event: null,
   group: null,
   name: "Jake",
   text: "Hello world"
}, {
   user: fje93jrg4,
   event: null,
   group: null,
   name: "Bob"
   text: "Testing"
}, {
   user: null,
   event: d0j3n9fn3,
   group: null,
   name: "My Event"
   text: "Testing 2"
}

My guess is that I'll probably have to use an aggregate with $first or something along those lines. The problem with doing 3 different queries is that I need to apply a limit so that I always get 10 results back. For example, there could be no recent groups in the mix, just events and users.

Answer

We can use the aggregation framework to do this. First we need to $sort by "user" and "_id". From there, we $group by "user" and use the $last accumulator operator to return the last document for each user. Note that we could equally sort in the opposite direction and use the $first accumulator operator instead; sorting this way and using $last simply makes the intent explicit.

db.collection.aggregate([
    { "$sort": { "user": 1, "_id": -1 } }, 
    { "$group": { 
        "_id": "$user", 
        "user": { "$last": "$$ROOT" } 
    }} 
])
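
As a rough sketch, the same pipeline can also be run from Mongoose via Model.aggregate(). The Message model name below is an assumption, and the trailing $sort/$limit stages are only there to illustrate the "always get 10 results back" requirement from the question:

const mongoose = require('mongoose');
const Message = mongoose.model('Message'); // hypothetical, assumes the model is already registered

Message.aggregate([
    // newest first within each user, so $last keeps the oldest (first) document
    { "$sort": { "user": 1, "_id": -1 } },
    { "$group": {
        "_id": "$user",
        "user": { "$last": "$$ROOT" }
    }},
    // optional: keep only the 10 most recent "first occurrences"
    { "$sort": { "user._id": -1 } },
    { "$limit": 10 }
]).then(docs => console.log(docs));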

Which yields:

{
    "_id" : "fje93jrg4",
    "user" : {
        "_id" : 2,
        "user" : "fje93jrg4",
        "event" : null,
        "group" : null,
        "name" : "Bob",
        "text" : "Testing"
    }
}
{
    "_id" : "94fg844f",
    "user" : {
        "_id" : 1,
        "user" : "94fg844f",
        "event" : null,
        "group" : null,
        "name" : "Jake",
        "text" : "Hello world"
    }
}
{
    "_id" : null,
    "user" : {
        "_id" : 4,
        "user" : null,
        "event" : "d0j3n9fn3",
        "group" : null,
        "name" : "My Event",
        "text" : "Testing 2"
    }
}

We may want to add a $project stage to our pipeline, but doing so will cost some performance. However, if we do not need all of the key/value pairs in the returned documents, it will reduce both the amount of data sent over the wire and the time and memory used to decode the documents on the client side.

The $project stage would look like this:

{ "$project": {
    "_id": "$user._id",
    "user": "$user.user",
    "event": "$user.event",
    "group": "$user.group",
    "name": "$user.name",
    "text": "$user.text"
}}
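
Since the $group stage tucks each original document under the user key, this $project simply lifts those fields back to the top level so the results come back in the same shape as the original documents.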
