Severe performance drop with MongoDB Change Streams


Question

I want to get real-time updates about MongoDB database changes in Node.js.

A single MongoDB change stream sends update notifications almost instantly. But when I open multiple (10+) streams, there are massive delays (up to several minutes) between database writes and notification arrival.

That's how I set up a change stream:

// Open a change stream on the collection, filtered to one room's events
let cursor = collection.watch([
  {$match: {"fullDocument.room": roomId}},
]);
cursor.stream().on("data", doc => {...});
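The $match stage above filters change events on a field of the inserted document. As a plain-JS illustration of what that predicate does (the event shape is the standard insert-event shape; the helper name is made up for this sketch), the filter amounts to:

```javascript
// Hypothetical stand-in for the $match stage: an insert event carries the
// new document under `fullDocument`, so the filter simply compares
// fullDocument.room against the room being watched.
function matchesRoom(changeEvent, roomId) {
  return Boolean(changeEvent.fullDocument) &&
    changeEvent.fullDocument.room === roomId;
}

const sampleEvent = {
  operationType: "insert",
  fullDocument: {_id: 1, room: "lobby", text: "hi"},
};
```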

I tried an alternative way to set up a stream, but it's just as slow:

// Equivalent stream built from the aggregation pipeline's $changeStream stage
let cursor = collection.aggregate([
  {$changeStream: {}},
  {$match: {"fullDocument.room": roomId}},
]);
cursor.forEach(doc => {...});

An automated process inserts tiny documents into the collection while collecting performance data.

Some additional details:

  • Open stream cursors count: 50
  • Write speed: 100 docs/second (batches of 10 using insertMany)
  • Runtime: 100 seconds
  • Average delay: 7.1 seconds
  • Largest delay: 205 seconds (not a typo, over three minutes)
  • MongoDB version: 3.6.2
  • Cluster setup #1: MongoDB Atlas M10 (3-node replica set)
  • Cluster setup #2: DigitalOcean Ubuntu box + single-instance mongo server in Docker
  • Node.js CPU usage: <1%
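For context, the average and largest delay quoted above can be derived from per-document write and arrival timestamps with a small helper (the function name and array layout are illustrative, not from the question's harness):

```javascript
// Given parallel arrays of write timestamps and notification-arrival
// timestamps (in milliseconds), compute the average and largest delay
// in seconds -- the two statistics reported in the details list.
function delayStats(writtenAt, receivedAt) {
  const delays = receivedAt.map((t, i) => (t - writtenAt[i]) / 1000);
  const avg = delays.reduce((sum, d) => sum + d, 0) / delays.length;
  const max = Math.max(...delays);
  return {avg, max};
}
```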

Both setups produce the same issue. What could be going on here?

Answer

The default connection pool size in the Node.js client for MongoDB is 5. Since each change stream cursor opens a new connection, the connection pool needs to be at least as large as the number of cursors.

const mongoConnection = await MongoClient.connect(URL, {poolSize: 100});
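Since every open change stream cursor holds one pooled connection for its lifetime, a back-of-the-envelope sizing rule can be sketched as follows (the helper name and the headroom default are assumptions for illustration, not part of the driver API):

```javascript
// Each change stream cursor occupies a pooled connection, so the pool must
// at least cover all open cursors plus a few spare connections for ordinary
// queries and writes running alongside them.
function minPoolSize(openStreamCount, regularOpHeadroom = 5) {
  return openStreamCount + regularOpHeadroom;
}
```

With the 50 cursors from the question, this gives a floor of 55, so poolSize: 100 leaves comfortable headroom.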

(Thanks to MongoDB Inc. for investigating this issue.)

