node.js how to handle fast producer and slow consumer with backpressure

Question

I'm very new to node.js and don't understand the documentation about streams. Hoping to get some tips.

I'm reading a very large file line by line, and then for each line I'm calling an async network API.

Obviously the local file is read much faster than the async calls are completed:

var lineReader = require('readline').createInterface({
  input: require('fs').createReadStream(program.input)
});

lineReader.on('line', function (line) {
    client.execute(query, [line], function(err, result) {
        // need to apply backpressure to the line reader here
        var myJSON = JSON.stringify(result);
        console.log("line=%s json=%s", line, myJSON);
    });
});

What is the way to add back pressure in the "execute" method?

Answer

The solution is to wrap the async behavior in a stream writer and throttle the async reader from within the writer:

var fs = require('fs');
var byline = require('byline');
var Writable = require('stream').Writable;

var concurrent = 100; // maximum number of async calls in flight
var count = 0;

var writable = new Writable({
    write: function (line, encoding, next) {
        count++;
        // while under the concurrency limit, release the next line immediately;
        // otherwise hold next() until a call completes (this is the backpressure)
        var released = count < concurrent;
        if (released) {
            next();
        }

        asyncFunctionToCall(line, function (err, result) {
            // completion callback: reduce the count and release back pressure
            count--;
            if (!released) {
                next();
            }
        });
    }
});

var stream = fs.createReadStream(program.input, {encoding: 'utf8'});
stream = byline.createStream(stream);
stream.pipe(writable);
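
If you would rather keep the readline interface from the question, a similar effect can be had by pausing the reader while too many calls are outstanding and resuming it as calls complete. This is only a sketch under the same assumptions as the question (client, query, program.input and a concurrency limit of 100 are placeholders), and note that readline may still deliver a few already-buffered lines right after pause():

var readline = require('readline');
var fs = require('fs');

var concurrent = 100;  // assumed concurrency limit
var inFlight = 0;

var lineReader = readline.createInterface({
    input: fs.createReadStream(program.input)
});

lineReader.on('line', function (line) {
    inFlight++;
    if (inFlight >= concurrent) {
        lineReader.pause();   // stop reading while the consumer is saturated
    }

    client.execute(query, [line], function (err, result) {
        inFlight--;
        var myJSON = JSON.stringify(result);
        console.log("line=%s json=%s", line, myJSON);
        if (inFlight < concurrent) {
            lineReader.resume();  // reading may continue once calls drain
        }
    });
});

Either way, the idea is the same: the slow consumer decides when the fast producer is allowed to hand over the next line.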
