NodeJS, Promises and performance


Problem Description


My question is about performance in my NodeJS app...

If my program runs 12 iterations of 1,250,000 each = 15,000,000 iterations altogether, it takes the following amount of time to process on dedicated Amazon servers:

r3.large: 2 vCPU, 6.5 ECU, 15 GB memory --> 123 minutes

c4.8xlarge: 36 vCPU, 132 ECU, 60 GB memory --> 102 minutes

I have some code similar to the code below...

start();

function start() {
  for (var i = 0; i < 12; i++) {
    // Iterates over a collection of data split up into date intervals.
    // This function is actually also recursive - it runs through the
    // data many times (max 50-100 times) due to different interval sizes...
    function2();
  }
}

function function2() {
  return new Promise(function (resolve) {
    for (var i = 0; i < 1250000; i++) {
      // Iterates through all possible combinations and calls function3()
      // with each given set of values/combinations.
      function3();
    }
    resolve();
  });
}

function function3() {
  return new Promise(function (resolve) {
    // Simply makes some calculations based on the given values/combination
    // and then returns the result to function2(), which in the end decides
    // which result/combination was the best...
    resolve(/* the result of the calculation */);
  });
}

This is equal to 0.411 milliseconds / 411 microseconds per iteration!

When I look at performance and memory usage in the taskbar, the CPU is not running at 100% but more like 50% - the entire time? The memory usage starts very low but KEEPS growing by gigabytes every minute until the process is done - yet the (allocated) memory is only released when I press CTRL+C in the Windows CMD... so it's as if the NodeJS garbage collection doesn't work optimally - or maybe it's simply the design of the code again...

When I execute the app, I use the memory option like this:

node --max-old-space-size="50000" server.js

PLEASE tell me everything you think I can do to make my program FASTER!

Thank you all - so much!

Solution

It's not that the garbage collector doesn't work optimally but that it doesn't work at all - you don't give it any chance to.

When developing the tco module that does tail call optimization in Node, I noticed a strange thing. It seemed to leak memory and I didn't know why. It turned out that it was because of a few console.log() calls in various places that I used for testing, to see what was going on - because seeing the result of a recursive call millions of levels deep took some time, so I wanted to see something while it was running.

Your example is pretty similar to that.

Remember that Node is single-threaded. When your computations run, nothing else can run - including the GC. Your code is completely synchronous and blocking - and even though it generates millions of promises, it does so in a blocking manner, because it never reaches the event loop.
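To see why the promises don't help, note that the executor function you pass to new Promise(...) runs synchronously - only the .then() callbacks are deferred. A quick illustration:

console.log("before");
new Promise(function (resolve) {
  // The executor runs immediately, on the same tick.
  console.log("inside executor");
  resolve();
});
console.log("after");
// Prints: before, inside executor, after

The same applies to the millions of promises your loops create.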

Consider this example:

var a = 0, b = 10000000;

function numbers() {
  while (a < b) {
    console.log("Number " + a++);
  }
}

numbers();

It's pretty simple - you want to print 10 million numbers. But when you run it, it behaves very strangely - for example, it prints numbers up to some point, then stops for several seconds, then keeps going, or maybe starts thrashing if you're using swap, or maybe gives you this error, which I got right after seeing Number 8486:

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Aborted

What's going on here is that the main thread is blocked in a synchronous loop where it keeps creating objects but the GC has no chance to release them.

For such long-running tasks you need to divide your work and get back to the event loop once in a while.

Here is how you can fix this problem:

var a = 0, b = 10000000;

function numbers() {
  var i = 0;
  while (a < b && i++ < 100) {
    console.log("Number " + a++);
  }
  if (a < b) setImmediate(numbers);
}

numbers();

It does the same thing - it prints the numbers from a to b, but in batches of 100, and then it schedules itself to continue at the end of the event loop.
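As a side note, setImmediate() is the important choice here. A variant built on process.nextTick() (a hypothetical numbersWrong(), sketched below) would not help, because nextTick callbacks run before control returns to the event loop - Node's documentation warns that recursive nextTick calls starve I/O:

function numbersWrong() {
  var i = 0;
  while (a < b && i++ < 100) {
    console.log("Number " + a++);
  }
  // Don't do this: process.nextTick() callbacks run before the event
  // loop continues, so this version never actually yields to I/O.
  if (a < b) process.nextTick(numbersWrong);
}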

Output of $(which time) -v node numbers1.js 2>&1 | egrep 'Maximum resident|FATAL'

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
    Maximum resident set size (kbytes): 1495968

It used 1.5GB of memory and crashed.

Output of $(which time) -v node numbers2.js 2>&1 | egrep 'Maximum resident|FATAL'

    Maximum resident set size (kbytes): 56404

It used 56MB of memory and finished.
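Applied to your own code, the same pattern means processing the 1,250,000 combinations in chunks and yielding between them. Here is a minimal sketch of function2() rewritten that way (the chunk size of 1000 is an assumption to tune, and the resolve() wiring is mine - your real code decides what result to resolve with):

function function2() {
  return new Promise(function (resolve) {
    var i = 0, total = 1250000, chunk = 1000;
    function processChunk() {
      var end = Math.min(i + chunk, total);
      while (i < end) {
        function3(); // your per-combination calculation
        i++;
      }
      if (i < total) {
        setImmediate(processChunk); // yield to the event loop between chunks
      } else {
        resolve(); // all combinations processed
      }
    }
    processChunk();
  });
}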

