Group by Date with Local Time Zone in MongoDB


Problem Description

I am new to MongoDB. Below is my query.

Model.aggregate()
            .match({ 'activationId': activationId, "t": { "$gte": new Date(fromTime), "$lt": new Date(toTime) } })
            .group({ '_id': { 'date': { $dateToString: { format: "%Y-%m-%d %H", date: "$datefield" } } }, uniqueCount: { $addToSet: "$mac" } })
            .project({ "date": 1, "month": 1, "hour": 1, uniqueMacCount: { $size: "$uniqueCount" } })
            .exec()
            .then(function (docs) {
                return docs;
            });

The issue is that MongoDB stores dates in UTC. I need this data for displaying an area chart.

I want to group by date in the local time zone. Is there any way to add a time offset to the date when grouping?

Answer

General Problem of Dealing with "local dates"

So there is a short answer to this and a long answer as well. The basic case is that instead of using any of the "date aggregation operators", you rather want to, and "need to", actually "do the math" on the date objects. The primary thing here is to adjust the values by the offset from UTC for the given local timezone and then "round" to the required interval.

The "much longer answer", and also the main problem to consider, is that dates are often subject to "Daylight Savings Time" changes in the offset from UTC at different times of the year. So this means that when converting to "local time" for such aggregation purposes, you really should consider where the boundaries for such changes exist.

There is also another consideration: no matter what you do to "aggregate" at a given interval, the output values "should" at least initially come out as UTC. This is good practice since display to "locale" really is a "client function", and as described later, the client interfaces will commonly have a way of displaying in the present locale which will be based on the premise that it was in fact fed data as UTC.

This is generally the main problem that needs to be solved. The general math for "rounding" a date to an interval is the simple part, but there is no real math you can apply to know when such boundaries apply, and the rules change in every locale and often every year.

So this is where a "library" comes in, and the best option here in the author's opinion for a JavaScript platform is moment-timezone, which is basically a "superset" of moment.js including all the important "timezone" features we want to use.

Moment Timezone basically defines such a structure for each locale timezone as:

{
    name    : 'America/Los_Angeles',          // the unique identifier
    abbrs   : ['PDT', 'PST'],                 // the abbreviations
    untils  : [1414918800000, 1425808800000], // the timestamps in milliseconds
    offsets : [420, 480]                      // the offsets in minutes
}

Where of course the objects are much larger with respect to the untils and offsets properties actually recorded. But that is the data you need to access in order to see if there is actually a change in the offset for a zone given daylight savings changes.
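
For instance (a minimal sketch; the exact values printed depend on the tzdata version bundled with your moment-timezone install), that data can be inspected directly:

const moment = require('moment-timezone');

// Compiled zone data for a locale; 'untils' are the transition timestamps in milliseconds,
// 'offsets' are the matching offsets in minutes west of UTC (so negative for Sydney's +10/+11)
const zone = moment.tz.zone('Australia/Sydney');

console.log(zone.name);                 // 'Australia/Sydney'
console.log(zone.untils.length);        // number of recorded offset transitions
console.log(zone.offsets.slice(0, 3));  // first few offsets in minutes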

This block of the later code listing is what we basically use to determine, given a start and end value for a range, which daylight savings boundaries are crossed, if any:

  const zone = moment.tz.zone(locale);
  if ( zone.hasOwnProperty('untils') ) {
    let between = zone.untils.filter( u =>
      u >= start.valueOf() && u < end.valueOf()
    );
    if ( between.length > 0 )
      branches = between
        .map( d => moment.tz(d, locale) )
        .reduce((acc,curr,i,arr) =>
          acc.concat(
            ( i === 0 )
              ? [{ start, end: curr }] : [{ start: acc[i-1].end, end: curr }],
            ( i === arr.length-1 ) ? [{ start: curr, end }] : []
          )
        ,[]);
  }

Looking at the whole of 2017 for the Australia/Sydney locale, the output of this would be:

[
  {
    "start": "2016-12-31T13:00:00.000Z",    // Interval is +11 hours here
    "end": "2017-04-01T16:00:00.000Z"
  },
  {
    "start": "2017-04-01T16:00:00.000Z",    // Changes to +10 hours here
    "end": "2017-09-30T16:00:00.000Z"
  },
  {
    "start": "2017-09-30T16:00:00.000Z",    // Changes back to +11 hours here
    "end": "2017-12-31T13:00:00.000Z"
  }
]

Which basically reveals that between the first sequence of dates the offset would be +11 hours, then changes to +10 hours between the dates in the second sequence, and then switches back to +11 hours for the interval covering the end of the year and the specified range.

This logic then needs to be translated into a structure that will be understood by MongoDB as part of an aggregation pipeline.

The mathematical principle here for aggregating to any "rounded date interval" essentially relies on using the milliseconds value of the represented date which is "rounded" down to the nearest number representing the "interval" required.

You essentially do this by finding the "modulo" or "remainder" of the current value applied to the required interval. Then you "subtract" that remainder from the current value which returns a value at the nearest interval.

For example, given the current date:

  var d = new Date("2017-07-14T01:28:34.931Z"); // .valueOf() is 1499995714931 millis
  // 1000 milliseconds * 60 seconds * 60 minutes = 1 hour or 3600000 millis
  var v = d.valueOf() - ( d.valueOf() % ( 1000 * 60 * 60 ) );
  // v equals 1499994000000 millis, or as a date:
  new Date(1499994000000);
  // ISODate("2017-07-14T01:00:00Z")
  // which removed the 28 minutes and change, rounding to the nearest 1 hour interval
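
Wrapped up as a small helper (a sketch only; the name roundToInterval is just for illustration), the same math generalizes to any interval:

  // Round a Date down to the nearest interval expressed in milliseconds
  function roundToInterval(date, intervalMs) {
    const millis = date.valueOf();
    return new Date(millis - (millis % intervalMs));
  }

  // Usage: round to the nearest hour
  roundToInterval(new Date("2017-07-14T01:28:34.931Z"), 1000 * 60 * 60);
  // => 2017-07-14T01:00:00.000Z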

This is the general math we also need to apply in the aggregation pipeline using the $subtract and $mod operations, which are the aggregation expressions for the same math shown above.

The general structure of the aggregation pipeline is then:

    let pipeline = [
      { "$match": {
        "createdAt": { "$gte": start.toDate(), "$lt": end.toDate() }
      }},
      { "$group": {
        "_id": {
          "$add": [
            { "$subtract": [
              { "$subtract": [
                { "$subtract": [ "$createdAt", new Date(0) ] },
                switchOffset(start,end,"$createdAt",false)
              ]},
              { "$mod": [
                { "$subtract": [
                  { "$subtract": [ "$createdAt", new Date(0) ] },
                  switchOffset(start,end,"$createdAt",false)
                ]},
                interval
              ]}
            ]},
            new Date(0)
          ]
        },
        "amount": { "$sum": "$amount" }
      }},
      { "$addFields": {
        "_id": {
          "$add": [
            "$_id", switchOffset(start,end,"$_id",true)
          ]
        }
      }},
      { "$sort": { "_id": 1 } }
    ];

The main part you need to understand here is the conversion from a Date object as stored in MongoDB to a numeric value representing its internal timestamp. We need the "numeric" form, and to get it we use a math trick: subtracting one BSON Date from another yields the numeric difference between them in milliseconds. That is exactly what this statement does:

{ "$subtract": [ "$createdAt", new Date(0) ] }

Now that we have a numeric value to deal with, we can apply the modulo and subtract that from the numeric representation of the date in order to "round" it. So the "straight" representation of this is:

{ "$subtract": [
  { "$subtract": [ "$createdAt", new Date(0) ] },
  { "$mod": [
    { "$subtract": [ "$createdAt", new Date(0) ] },
    ( 1000 * 60 * 60 * 24 ) // 24 hours
  ]}
]}

This mirrors the same JavaScript math approach as shown earlier, but applied to the actual document values in the aggregation pipeline. You will also note the other "trick" there, where we apply an $add operation with another representation of a BSON Date at epoch (or 0 milliseconds): the "addition" of a "numeric" value to a BSON Date returns a "BSON Date" corresponding to the milliseconds it was given as input.
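
Put together in isolation (a sketch only; "interval" here is just a sample value of 24 hours), the two conversion tricks plus the rounding look like this:

// Sketch: round $createdAt down to whole days, then convert back to a BSON Date
const interval = 1000 * 60 * 60 * 24;

// BSON Date -> numeric milliseconds since epoch
const asMillis = { "$subtract": [ "$createdAt", new Date(0) ] };

// numeric milliseconds rounded down to the interval
const rounded = { "$subtract": [ asMillis, { "$mod": [ asMillis, interval ] } ] };

// numeric milliseconds -> BSON Date again
const roundedDate = { "$add": [ rounded, new Date(0) ] };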

Of course the other consideration in the listed code is the actual "offset" from UTC, which adjusts the numeric values in order to ensure the "rounding" takes place for the present timezone. This is implemented in a function based on the earlier description of finding where the different offsets occur, which compares the input dates and returns the correct offset in a format usable in an aggregation pipeline expression.

With all the details fully expanded, including the generated handling of those different "Daylight Savings" time offsets, the pipeline then looks like:

[
  {
    "$match": {
      "createdAt": {
        "$gte": "2016-12-31T13:00:00.000Z",
        "$lt": "2017-12-31T13:00:00.000Z"
      }
    }
  },
  {
    "$group": {
      "_id": {
        "$add": [
          {
            "$subtract": [
              {
                "$subtract": [
                  {
                    "$subtract": [
                      "$createdAt",
                      "1970-01-01T00:00:00.000Z"
                    ]
                  },
                  {
                    "$switch": {
                      "branches": [
                        {
                          "case": {
                            "$and": [
                              {
                                "$gte": [
                                  "$createdAt",
                                  "2016-12-31T13:00:00.000Z"
                                ]
                              },
                              {
                                "$lt": [
                                  "$createdAt",
                                  "2017-04-01T16:00:00.000Z"
                                ]
                              }
                            ]
                          },
                          "then": -39600000
                        },
                        {
                          "case": {
                            "$and": [
                              {
                                "$gte": [
                                  "$createdAt",
                                  "2017-04-01T16:00:00.000Z"
                                ]
                              },
                              {
                                "$lt": [
                                  "$createdAt",
                                  "2017-09-30T16:00:00.000Z"
                                ]
                              }
                            ]
                          },
                          "then": -36000000
                        },
                        {
                          "case": {
                            "$and": [
                              {
                                "$gte": [
                                  "$createdAt",
                                  "2017-09-30T16:00:00.000Z"
                                ]
                              },
                              {
                                "$lt": [
                                  "$createdAt",
                                  "2017-12-31T13:00:00.000Z"
                                ]
                              }
                            ]
                          },
                          "then": -39600000
                        }
                      ]
                    }
                  }
                ]
              },
              {
                "$mod": [
                  {
                    "$subtract": [
                      {
                        "$subtract": [
                          "$createdAt",
                          "1970-01-01T00:00:00.000Z"
                        ]
                      },
                      {
                        "$switch": {
                          "branches": [
                            {
                              "case": {
                                "$and": [
                                  {
                                    "$gte": [
                                      "$createdAt",
                                      "2016-12-31T13:00:00.000Z"
                                    ]
                                  },
                                  {
                                    "$lt": [
                                      "$createdAt",
                                      "2017-04-01T16:00:00.000Z"
                                    ]
                                  }
                                ]
                              },
                              "then": -39600000
                            },
                            {
                              "case": {
                                "$and": [
                                  {
                                    "$gte": [
                                      "$createdAt",
                                      "2017-04-01T16:00:00.000Z"
                                    ]
                                  },
                                  {
                                    "$lt": [
                                      "$createdAt",
                                      "2017-09-30T16:00:00.000Z"
                                    ]
                                  }
                                ]
                              },
                              "then": -36000000
                            },
                            {
                              "case": {
                                "$and": [
                                  {
                                    "$gte": [
                                      "$createdAt",
                                      "2017-09-30T16:00:00.000Z"
                                    ]
                                  },
                                  {
                                    "$lt": [
                                      "$createdAt",
                                      "2017-12-31T13:00:00.000Z"
                                    ]
                                  }
                                ]
                              },
                              "then": -39600000
                            }
                          ]
                        }
                      }
                    ]
                  },
                  86400000
                ]
              }
            ]
          },
          "1970-01-01T00:00:00.000Z"
        ]
      },
      "amount": {
        "$sum": "$amount"
      }
    }
  },
  {
    "$addFields": {
      "_id": {
        "$add": [
          "$_id",
          {
            "$switch": {
              "branches": [
                {
                  "case": {
                    "$and": [
                      {
                        "$gte": [
                          "$_id",
                          "2017-01-01T00:00:00.000Z"
                        ]
                      },
                      {
                        "$lt": [
                          "$_id",
                          "2017-04-02T03:00:00.000Z"
                        ]
                      }
                    ]
                  },
                  "then": -39600000
                },
                {
                  "case": {
                    "$and": [
                      {
                        "$gte": [
                          "$_id",
                          "2017-04-02T02:00:00.000Z"
                        ]
                      },
                      {
                        "$lt": [
                          "$_id",
                          "2017-10-01T02:00:00.000Z"
                        ]
                      }
                    ]
                  },
                  "then": -36000000
                },
                {
                  "case": {
                    "$and": [
                      {
                        "$gte": [
                          "$_id",
                          "2017-10-01T03:00:00.000Z"
                        ]
                      },
                      {
                        "$lt": [
                          "$_id",
                          "2018-01-01T00:00:00.000Z"
                        ]
                      }
                    ]
                  },
                  "then": -39600000
                }
              ]
            }
          }
        ]
      }
    }
  },
  {
    "$sort": {
      "_id": 1
    }
  }
]

That expansion is using the $switch statement in order to apply the date ranges as conditions for when to return the given offset values. This is the most convenient form, since the "branches" argument corresponds directly to an "array", which is the most convenient output of the "ranges" determined by examining the untils that represent the offset "cut-points" for the given timezone over the supplied date range of the query.

It is possible to apply the same logic in earlier versions of MongoDB using a "nested" implementation of $cond instead, but it is a little messier to implement, so we are just using the most convenient method here.
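
For illustration only (a hedged sketch, not part of the original listing; a real implementation would generate the nesting from the same branches array), the branches of the $switch above could be expressed with nested $cond expressions roughly like this:

// Equivalent of the $switch branches using nested $cond
const offsetExpr = {
  "$cond": [
    { "$and": [
      { "$gte": [ "$createdAt", new Date("2016-12-31T13:00:00Z") ] },
      { "$lt":  [ "$createdAt", new Date("2017-04-01T16:00:00Z") ] }
    ]},
    -39600000,                     // +11 hours, as a negative millisecond adjustment
    { "$cond": [
      { "$and": [
        { "$gte": [ "$createdAt", new Date("2017-04-01T16:00:00Z") ] },
        { "$lt":  [ "$createdAt", new Date("2017-09-30T16:00:00Z") ] }
      ]},
      -36000000,                   // +10 hours
      -39600000                    // remaining range: back to +11 hours
    ]}
  ]
};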

Once all of those conditions are applied, the dates being "aggregated" are actually those representing the "local" time as defined by the supplied locale. This brings us to what the final aggregation stage is for, the reason it is there, and the later handling demonstrated in the listing.

I did mention earlier that the general recommendation is that the "output" should still return the date values in UTC in at least some form. That is exactly what the pipeline here is doing: first converting "from" UTC to local by applying the offset when "rounding", and then re-adjusting the final grouped values back by the same offset that applies to the "rounded" date values.

The listing gives "three" different output possibilities:

// ISO Format string from JSON stringify default
[
  {
    "_id": "2016-12-31T13:00:00.000Z",
    "amount": 2
  },
  {
    "_id": "2017-01-01T13:00:00.000Z",
    "amount": 1
  },
  {
    "_id": "2017-01-02T13:00:00.000Z",
    "amount": 2
  }
]
// Timestamp value - milliseconds from epoch UTC - least space!
[
  {
    "_id": 1483189200000,
    "amount": 2
  },
  {
    "_id": 1483275600000,
    "amount": 1
  },
  {
    "_id": 1483362000000,
    "amount": 2
  }
]

// Force locale format to string via moment .format()
[
  {
    "_id": "2017-01-01T00:00:00+11:00",
    "amount": 2
  },
  {
    "_id": "2017-01-02T00:00:00+11:00",
    "amount": 1
  },
  {
    "_id": "2017-01-03T00:00:00+11:00",
    "amount": 2
  }
]

The one thing of note here is that for a "client" such as Angular, every single one of those formats would be accepted by its own DatePipe, which can actually do the "locale format" for you. But it depends on where the data is supplied to. "Good" libraries will be aware of using a UTC date in the present locale. Where that is not the case, you might need to "stringify" yourself.

But it is a simple thing, and you get the most support for this by using a library which essentially bases its manipulation of output on a "given UTC value".
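
As a small client-side sketch (assuming the timestamp output form shown above, and using moment-timezone purely at display time; the variable names are illustrative):

const moment = require('moment-timezone');

// UTC-based values straight from the server
const rows = [
  { _id: 1483189200000, amount: 2 },
  { _id: 1483275600000, amount: 1 }
];

// Format for the viewer's locale only at the point of display
const display = rows.map(r => ({
  label: moment.tz(r._id, 'Australia/Sydney').format('YYYY-MM-DD'),
  amount: r.amount
}));
// => [ { label: '2017-01-01', amount: 2 }, { label: '2017-01-02', amount: 1 } ]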

The main thing here is to "understand what you are doing" when you ask for something like aggregating to a local time zone. Such a process should consider that:


  1. The data can be and often is viewed from the perspective of people within different timezones.

  2. The data is generally provided by people in different timezones. Combined with point 1, this is why we store in UTC.

  3. Timezones are often subject to a changing "offset" from "Daylight Savings Time" in many of the world's timezones, and you should account for that when analyzing and processing the data.

  4. Regardless of aggregation intervals, output "should" in fact remain in UTC, albeit adjusted to aggregate on interval according to the locale provided. This leaves presentation to be delegated to a "client" function, just as it should be.

As long as you keep those things in mind and apply them just like the listing here demonstrates, then you are doing all the right things for dealing with aggregation of dates, and even general storage, with respect to a given locale.

So you "should" be doing this, and what you "should not" be doing is giving up and simply storing the "locale date" as a string. As described, that would be a very incorrect approach and causes nothing but further problems for your application.


NOTE: The one topic I do not touch on here at all is aggregating to a "month" (or indeed "year") interval. "Months" are the mathematical anomaly in the whole process since the number of days always varies, and thus require a whole other set of logic in order to apply. Describing that alone would be at least as long as this post, and is therefore another subject. For the general minutes, hours, and days which are the common case, the math here is "good enough".

Full Listing

This serves as a "demonstration" to tinker with. It employs the required function to extract the offset dates and values to be included, and runs an aggregation pipeline over the supplied data.

You can change anything in here, but you will probably start with the locale and interval parameters, and then maybe add different data and different start and end dates for the query. The rest of the code need not be changed in order to change any of those values, and can therefore demonstrate using different intervals (such as the 1 hour asked about in the question) and different locales.

For instance, once supplied with valid data that would actually require aggregation at a "1 hour interval", the line in the listing would be changed to:

const interval = moment.duration(1,'hour').asMilliseconds();

This defines a milliseconds value for the aggregation interval, as required by the aggregation operations being performed on the dates.
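
Any other interval works the same way; a few illustrative values (a sketch only):

const moment = require('moment-timezone');

// Example interval sizes, all expressed in milliseconds
const fifteenMinutes = moment.duration(15, 'minutes').asMilliseconds(); // 900000
const oneHour        = moment.duration(1, 'hour').asMilliseconds();     // 3600000
const oneDay         = moment.duration(1, 'day').asMilliseconds();      // 86400000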

const moment = require('moment-timezone'),
      mongoose = require('mongoose'),
      Schema = mongoose.Schema;

mongoose.Promise = global.Promise;
mongoose.set('debug',true);

const uri = 'mongodb://localhost/test',
      options = { useMongoClient: true };

const locale = 'Australia/Sydney';
const interval = moment.duration(1,'day').asMilliseconds();

const reportSchema = new Schema({
  createdAt: Date,
  amount: Number
});

const Report = mongoose.model('Report', reportSchema);

function log(data) {
  console.log(JSON.stringify(data,undefined,2))
}

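// Builds a $switch expression returning the UTC offset adjustment for the given date
// field (in milliseconds, as the negated utcOffset), splitting the start/end range at
// any daylight savings boundaries found in the moment-timezone data for the locale.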
function switchOffset(start,end,field,reverseOffset) {

  let branches = [{ start, end }]

  const zone = moment.tz.zone(locale);
  if ( zone.hasOwnProperty('untils') ) {
    let between = zone.untils.filter( u =>
      u >= start.valueOf() && u < end.valueOf()
    );
    if ( between.length > 0 )
      branches = between
        .map( d => moment.tz(d, locale) )
        .reduce((acc,curr,i,arr) =>
          acc.concat(
            ( i === 0 )
              ? [{ start, end: curr }] : [{ start: acc[i-1].end, end: curr }],
            ( i === arr.length-1 ) ? [{ start: curr, end }] : []
          )
        ,[]);
  }

  log(branches);

  branches = branches.map( d => ({
    case: {
      $and: [
        { $gte: [
          field,
          new Date(
            d.start.valueOf()
            + ((reverseOffset)
              ? moment.duration(d.start.utcOffset(),'minutes').asMilliseconds()
              : 0)
          )
        ]},
        { $lt: [
          field,
          new Date(
            d.end.valueOf()
            + ((reverseOffset)
              ? moment.duration(d.start.utcOffset(),'minutes').asMilliseconds()
              : 0)
          )
        ]}
      ]
    },
    then: -1 * moment.duration(d.start.utcOffset(),'minutes').asMilliseconds()
  }));

  return ({ $switch: { branches } });

}

(async function() {
  try {
    const conn = await mongoose.connect(uri,options);

    // Data cleanup
    await Promise.all(
      Object.keys(conn.models).map( m => conn.models[m].remove({}))
    );

    let inserted = await Report.insertMany([
      { createdAt: moment.tz("2017-01-01",locale), amount: 1 },
      { createdAt: moment.tz("2017-01-01",locale), amount: 1 },
      { createdAt: moment.tz("2017-01-02",locale), amount: 1 },
      { createdAt: moment.tz("2017-01-03",locale), amount: 1 },
      { createdAt: moment.tz("2017-01-03",locale), amount: 1 },
    ]);

    log(inserted);

    const start = moment.tz("2017-01-01", locale),
          end   = moment.tz("2018-01-01", locale);

    let pipeline = [
      { "$match": {
        "createdAt": { "$gte": start.toDate(), "$lt": end.toDate() }
      }},
      { "$group": {
        "_id": {
          "$add": [
            { "$subtract": [
              { "$subtract": [
                { "$subtract": [ "$createdAt", new Date(0) ] },
                switchOffset(start,end,"$createdAt",false)
              ]},
              { "$mod": [
                { "$subtract": [
                  { "$subtract": [ "$createdAt", new Date(0) ] },
                  switchOffset(start,end,"$createdAt",false)
                ]},
                interval
              ]}
            ]},
            new Date(0)
          ]
        },
        "amount": { "$sum": "$amount" }
      }},
      { "$addFields": {
        "_id": {
          "$add": [
            "$_id", switchOffset(start,end,"$_id",true)
          ]
        }
      }},
      { "$sort": { "_id": 1 } }
    ];

    log(pipeline);
    let results = await Report.aggregate(pipeline);

    // log raw Date objects, will stringify as UTC in JSON
    log(results);

    // I like to output timestamp values and let the client format
    results = results.map( d =>
      Object.assign(d, { _id: d._id.valueOf() })
    );
    log(results);

    // Or use moment to format the output for locale as a string
    results = results.map( d =>
      Object.assign(d, { _id: moment.tz(d._id, locale).format() } )
    );
    log(results);

  } catch(e) {
    console.error(e);
  } finally {
    mongoose.disconnect();
  }
})()
