nodejs response speed and nginx


Question


Just started testing nodejs, and wanted to get some help in understanding the following behavior:

Example #1:

var http = require('http');
http.createServer(function(req, res){
    res.writeHeader(200, {'Content-Type': 'text/plain'});
    res.end('foo');
}).listen(1001, '0.0.0.0');

Example #2:

var http = require('http');
http.createServer(function(req, res){
    res.writeHeader(200, {'Content-Type': 'text/plain'});
    res.write('foo');
    res.end('bar');
}).listen(1001, '0.0.0.0');

When testing response time in Chrome:

example #1 - 6-10ms
example #2 - 200-220ms

But if I test both examples through nginx proxy_pass:

server{
    listen 1011;
    location / {
        proxy_pass http://127.0.0.1:1001;
    }
}

I get this:

example #1 - 4-8ms
example #2 - 4-8ms

I am not an expert on either nodejs or nginx, so can someone explain this?

nodejs - v.0.8.1
nginx - v.1.2.2

Update:

Thanks to Hippo, I ran tests with ab on my server, with and without nginx, and got the opposite results.

I also added proxy_cache off to the nginx config:

server{
    listen 1011;
    location / {
        proxy_pass http://127.0.0.1:1001;
        proxy_cache off;
    }
}
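A side note on that added directive (my observation, not part of the original post): proxy_cache is already off unless a cache zone has been configured, so that line is effectively a no-op. The directive that controls whether nginx collects the upstream's writes before sending anything to the client is proxy_buffering, which is on by default. A config sketch, if you want nginx to forward each upstream chunk as it arrives instead:

```nginx
server {
    listen 1011;
    location / {
        proxy_pass http://127.0.0.1:1001;
        proxy_buffering off;  # pass each upstream chunk through immediately
    }
}
```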

example #1 direct:

ab -n 1000 -c 50 http://127.0.0.1:1001/


    Server Software:        
    Server Hostname:        127.0.0.1
    Server Port:            1001

    Document Path:          /
    Document Length:        65 bytes

    Concurrency Level:      50
    Time taken for tests:   1.018 seconds
    Complete requests:      1000
    Failed requests:        0
    Write errors:           0
    Total transferred:      166000 bytes
    HTML transferred:       65000 bytes
    Requests per second:    981.96 [#/sec] (mean)
    Time per request:       50.919 [ms] (mean)
    Time per request:       1.018 [ms] (mean, across all concurrent requests)
    Transfer rate:          159.18 [Kbytes/sec] received

    Connection Times (ms)
                  min  mean[+/-sd] median   max
    Connect:        0    0   0.6      0       3
    Processing:     0   50  44.9     19     183
    Waiting:        0   49  44.8     17     183
    Total:          1   50  44.7     19     183

example #1 nginx:

ab -n 1000 -c 50 http://127.0.0.1:1011/


    Server Software:        nginx/1.2.2
    Server Hostname:        127.0.0.1
    Server Port:            1011

    Document Path:          /
    Document Length:        65 bytes

    Concurrency Level:      50
    Time taken for tests:   1.609 seconds
    Complete requests:      1000
    Failed requests:        0
    Write errors:           0
    Total transferred:      187000 bytes
    HTML transferred:       65000 bytes
    Requests per second:    621.40 [#/sec] (mean)
    Time per request:       80.463 [ms] (mean)
    Time per request:       1.609 [ms] (mean, across all concurrent requests)
    Transfer rate:          113.48 [Kbytes/sec] received

    Connection Times (ms)
                  min  mean[+/-sd] median   max
    Connect:        0    0   0.6      0       3
    Processing:     2   77  44.9     96     288
    Waiting:        2   77  44.8     96     288
    Total:          3   78  44.7     96     288

example #2 direct:

ab -n 1000 -c 50 http://127.0.0.1:1001/


    Server Software:        
    Server Hostname:        127.0.0.1
    Server Port:            1001

    Document Path:          /
    Document Length:        76 bytes

    Concurrency Level:      50
    Time taken for tests:   1.257 seconds
    Complete requests:      1000
    Failed requests:        0
    Write errors:           0
    Total transferred:      177000 bytes
    HTML transferred:       76000 bytes
    Requests per second:    795.47 [#/sec] (mean)
    Time per request:       62.856 [ms] (mean)
    Time per request:       1.257 [ms] (mean, across all concurrent requests)
    Transfer rate:          137.50 [Kbytes/sec] received

    Connection Times (ms)
                  min  mean[+/-sd] median   max
    Connect:        0    0   0.3      0       2
    Processing:     0   60  47.8     88     193
    Waiting:        0   60  47.8     87     193
    Total:          0   61  47.7     88     193

example #2 nginx:

ab -n 1000 -c 50 http://127.0.0.1:1011/


    Server Software:        nginx/1.2.2
    Server Hostname:        127.0.0.1
    Server Port:            1011

    Document Path:          /
    Document Length:        76 bytes

    Concurrency Level:      50
    Time taken for tests:   1.754 seconds
    Complete requests:      1000
    Failed requests:        0
    Write errors:           0
    Total transferred:      198000 bytes
    HTML transferred:       76000 bytes
    Requests per second:    570.03 [#/sec] (mean)
    Time per request:       87.715 [ms] (mean)
    Time per request:       1.754 [ms] (mean, across all concurrent requests)
    Transfer rate:          110.22 [Kbytes/sec] received

    Connection Times (ms)
                  min  mean[+/-sd] median   max
    Connect:        0    0   0.4      0       2
    Processing:     1   87  42.1     98     222
    Waiting:        1   86  42.3     98     222
    Total:          1   87  42.0     98     222


Now the results look more logical, but there is still a strange delay when calling res.write().

I guess it was (and sure looks like) a stupid question, but I still get a huge difference in browser response times with this server configuration (CentOS 6) on this particular server (a VPS).

On my home computer (Ubuntu 12), albeit with older versions and testing from localhost, everything works fine.


Solution

Peeking into http.js reveals that case #1 has special handling in nodejs itself, some kind of a shortcut optimization I guess.

var hot = this._headerSent === false &&
            typeof(data) === 'string' &&
            data.length > 0 &&
            this.output.length === 0 &&
            this.connection &&
            this.connection.writable &&
            this.connection._httpMessage === this;

      if (hot) {
        // Hot path. They're doing
        //   res.writeHead();
        //   res.end(blah);
        // HACKY.

        if (this.chunkedEncoding) {
          var l = Buffer.byteLength(data, encoding).toString(16);
          ret = this.connection.write(this._header + l + CRLF +
                                      data + '\r\n0\r\n' +
                                      this._trailer + '\r\n', encoding);
        } else {
          ret = this.connection.write(this._header + data, encoding);
        }
        this._headerSent = true;

      } else if (data) {
        // Normal body write.
        ret = this.write(data, encoding);
      }

      if (!hot) {
        if (this.chunkedEncoding) {
          ret = this._send('0\r\n' + this._trailer + '\r\n'); // Last chunk.
        } else {
          // Force a flush, HACK.
          ret = this._send('');
        }
      }

      this.finished = true;
