Mongoose limiting query to 1000 results when I want more/all (migrating from 2.6.5 to 3.1.2)

Question

I'm migrating my app from Mongoose 2.6.5 to 3.1.2, and I'm running into some unexpected behavior. Namely I notice that query results are automatically being limited to 1000 records, while pretty much everything else works the same. In my code (below) I set a value maxIvDataPoints that limits the number of data points returned (and ultimately sent to the client browser), and that value was set elsewhere to 1500. I use a count query to determine the total number of potential results, and then a subsequent mod to limit the actual query results using the count and the value of maxIvDataPoints to determine the value of the mod. I'm running node 0.8.4 and mongo 2.0.4, writing server-side code in coffeescript.

Prior to installing mongoose 3.1.x the code was working as I had wanted, returning just under 1500 data points each time. After installing 3.1.2 I'm getting exactly 1000 data points returned each time (assuming there are more than 1000 data points in the specified range). The results are truncated, so that data points 1001 to ~1500 are the ones no longer being returned.

It seems there may be some setting somewhere that governs this behavior, but I can't find anything in the docs, on here, or in the Google group. I'm still a relative n00b so I may have missed something obvious.

DataManager::ivDataQueryStream = (testId, minTime, maxTime, callback) ->

    # If minTime and maxTime have been provided, set a flag to limit time extents of query
    unless isNaN(minTime)
        timeLimits = true

    # Load the max number of IV data points to be displayed from CONFIG
    maxIvDataPoints = CONFIG.maxIvDataPoints

    # Construct a count query to determine the number of IV data points in range
    ivCountQuery = TestDataPoint.count({})
    ivCountQuery.where "testId", testId

    if timeLimits
        ivCountQuery.gt "testTime", minTime
        ivCountQuery.lt "testTime", maxTime

    ivCountQuery.exec (err, count) ->

        ivDisplayQuery = TestDataPoint.find({})
        ivDisplayQuery.where "testId", testId

        if timeLimits
            ivDisplayQuery.gt "testTime", minTime
            ivDisplayQuery.lt "testTime", maxTime

        # If the data set is too large, use modulo to sample, keeping the total data series
        # for display below maxIvDataPoints
        if count > maxIvDataPoints
            dataMod = Math.ceil count/maxIvDataPoints

            ivDisplayQuery.mod "dataPoint", dataMod, 1

        ivDisplayQuery.sort "dataPoint" #, 1 <-- new sort syntax for Mongoose 3.x
        callback ivDisplayQuery.stream()
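
For a sense of the sampling arithmetic above: with maxIvDataPoints = 1500 and, say, 4000 matching documents, dataMod = Math.ceil(4000/1500) = 3, so the mod clause keeps only the documents whose dataPoint leaves remainder 1 when divided by 3, roughly 1333 points, comfortably under the 1500 ceiling.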

Answer

You're getting tripped up by a pair of related factors:

  1. Mongoose's default query batchSize changed to 1000 in 3.1.2.
  2. MongoDB has a known issue where a query that requires an in-memory sort caps the number of documents returned at the query's batch size.
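
Taken together, these two factors mean the sorted result stream stops after the first (and only) batch of 1000 documents. One workaround, anticipating the second option below, is to push the batch size above the expected result count. A minimal sketch against the code above, assuming Query#batchSize is available in your Mongoose 3 build (it simply sets the batchSize option on the underlying cursor); count + 1 is just an illustrative value:

    # Inside the ivCountQuery.exec callback, once ivDisplayQuery is built:
    # raise the cursor batch size above the number of matching documents so
    # the in-memory sort no longer truncates the results at 1000
    ivDisplayQuery.batchSize count + 1 if count > 0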

So your options are to put a compound index on TestDataPoint that would allow mongo to use it for sorting by dataPoint in this type of query, or to increase the batch size to at least the total count of documents you're expecting.
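
A minimal sketch of the index option, assuming the model is compiled from a schema object (TestDataPointSchema is a hypothetical name here); the query matches on testId and sorts on dataPoint, so those two fields make up the index:

    # On the existing schema object, before the model is compiled:
    # a compound index with an equality-matched prefix (testId) followed by
    # the sort field (dataPoint) lets MongoDB walk the index for the sort
    # instead of sorting in memory, removing the batch-size cap on results
    TestDataPointSchema.index { testId: 1, dataPoint: 1 }

With Mongoose's default autoIndex behavior the index is ensured when the model is first used; the same index can also be created directly from the mongo shell.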
