Insert performance of node-mongodb-native

Problem Description

I'm testing performance of Node.js with MongoDB. I know each of these is fine independent of the other, but I'm trying a handful of tests to get a feel for them. I ran across this issue and I'm having trouble determining the source.

The Problem

I'm trying to insert 1,000,000 records in a single Node.js program. It absolutely crawls. We're talking 20 minute execution time. This occurs whether it's my Mac or CentOS, although the behavior is marginally different between the two. It does eventually complete.

The effect is similar to swapping, although it's not (memory never exceeds 2 GB). There are only 3 connections open to MongoDB, and most of the time there's no data being inserted. It appears to be doing a lot of context switching, and the Node.js CPU core is maxed out.

The effect is similar to the one mentioned in this thread.

I try the same using PHP and it finishes in 2-3 minutes. No drama.

Why?

Possible Causes

I currently believe this is either a Node.js socket issue, something going on with libev behind the scenes, or some other node-mongodb-native issue. I may be totally wrong, so I'm looking for a little guidance here.

As for other Node.js MongoDB adapters, I have tried Mongolian and it appears to queue documents in order to batch insert them, and it ends up running out of memory. So that's out. (Side note: I have no idea why on this, either, since it doesn't even come close to my 16 GB box limit--but I haven't bothered investigating much further on that.)
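
(The Mongolian code isn't shown, and the sketch below isn't how that driver works internally. It's just an illustration of bounded batching with node-mongodb-native, whose insert also accepts an array of documents: send a chunk, wait for the acknowledgement, then send the next, so pending documents never pile up without limit. The insertInChunks, chunkSize, and done names are invented for the example.)

# Illustrative only: chunked batch inserts with a bounded pending queue
insertInChunks = (collection, docs, chunkSize, done) ->
  # All documents sent: report completion
  return done() if docs.length == 0
  chunk = docs[0...chunkSize]
  rest  = docs[chunkSize...]
  # insert accepts an array; the next chunk is only sent once this one is acknowledged
  collection.insert chunk, {safe: true}, (error) ->
    return done error if error
    insertInChunks collection, rest, chunkSize, done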

I should probably mention that I did in fact test a master/worker cluster with 4 workers (on a quad-core machine) and it finished in 2-3 minutes.
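
That cluster code isn't included in the question either, so the following is only a rough sketch of how such a master/worker split might be wired up, using Node's cluster module together with the same driver calls as the program below; the names and the trimmed-down document are assumptions, not the actual test code.

cluster = require "cluster"
mongodb = require "mongodb"
crypto  = require "crypto"

numWorkers = 4
times      = 1000000
perWorker  = times / numWorkers

if cluster.isMaster
  # The master only forks the workers; each worker performs its share of the inserts
  cluster.fork() for n in [0...numWorkers]
else
  server = new mongodb.Server "127.0.0.1", 27017
  db = new mongodb.Db "test", server
  db.open (error, client) ->
    throw error if error?
    collection = new mongodb.Collection client, "foo"

    for i in [0...perWorker]
      # A trimmed-down document; the real test used the ten-field document shown below
      hash = crypto.createHash "sha1"
      hash.update "" + Date.now() + Math.random()
      collection.insert {key: hash.digest("hex")}, {safe: true}, (error) ->
        console.log error.message if error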

The Code

Here's my Node.js CoffeeScript program:

mongodb = require "mongodb"     # node-mongodb-native
microtime = require "microtime" # npm "microtime" module, used for key entropy
crypto = require "crypto"

times = 1000000
server = new mongodb.Server "127.0.0.1", 27017
db = new mongodb.Db "test", server
db.open (error, client) ->
  throw error if error?

  collection = new mongodb.Collection client, "foo"

  # All one million inserts are issued from this single synchronous loop;
  # every callback stays pending in memory until its insert is acknowledged.
  for i in [0...times]
    console.log "Inserting #{i}..." if i % 100000 == 0

    # Build a pseudo-random SHA-1 hex string to use as the document key
    hash = crypto.createHash "sha1"
    hash.update "" + microtime.now() + (Math.random() * 255 | 0)
    key = hash.digest "hex"

    doc =
      key: key,
      foo1: 1000,
      foo2: 1000,
      foo3: 1000,
      bar1: 2000,
      bar2: 2000,
      bar3: 2000,
      baz1: 3000,
      baz2: 3000,
      baz3: 3000

    collection.insert doc, {safe: true}, (error, response) ->
      console.log error.message if error

And here's the roughly equivalent PHP program:

<?php
$mongo = new Mongo();
$collection = $mongo->test->foo;

$times = 1000000;
for ($i = 0; $i < $times; $i++) {
    if ($i % 100000 == 0) {
        print "Inserting $i...\n";
    }

    $doc = array(
        "key" => sha1(microtime(true) + rand(0, 255)),
        "foo1" => 1000,
        "foo2" => 1000,
        "foo3" => 1000,
        "bar1" => 2000,
        "bar2" => 2000,
        "bar3" => 2000,
        "baz1" => 3000,
        "baz2" => 3000,
        "baz3" => 3000
    );
    try {
        $collection->insert($doc, array("safe" => true));
    } catch (MongoCursorException $e) {
        print $e->getMessage() . "\n";
    }
}

Recommended Answer

It sounds like you're running into the default heap limit in V8. I wrote a blog post about removing this limitation.

The garbage collector is probably going crazy and chewing on CPU, since it will constantly execute until you're under the 1.4GB limit.
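
The blog post isn't reproduced here; the flag commonly used for this is V8's --max-old-space-size (value in megabytes), passed to the node process at startup. Before raising anything, the diagnosis can be checked with a small heap log like the sketch below, which is an addition for illustration rather than part of the original answer; process.memoryUsage() is a standard Node.js call.

# Diagnostic sketch: log V8 heap usage once a second while the inserts are in flight
heapTimer = setInterval ->
  used = Math.round process.memoryUsage().heapUsed / 1024 / 1024
  console.log "heapUsed: #{used} MB"
, 1000
# call clearInterval(heapTimer) once all inserts have been acknowledged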
