Iterate through an array in blocks of 50 items at a time in node.js


Problem description

I'm new to node.js and am currently trying to code array iterations. I have an array of 1,000 items, which I'd like to iterate through in blocks of 50 items at a time due to problems with server load.

I currently use a forEach loop, as seen below (which I'm hoping to transform into the aforementioned block iteration):

// result is the array of 1,000 items
result.forEach(function (item) {
  // Do some data parsing
  // And upload data to server
});

Any help would be much appreciated!

Update (in reply to the answer)

async function uploadData(dataArray) {
    try {
        const chunks = chunkArray(dataArray, 50);
        for (const chunk of chunks) {
            await uploadDataChunk(chunk);
        }
    } catch (error) {
        console.log(error);
        // Catch an error here
    }
}

function uploadDataChunk(chunk) {
    return Promise.all(
        chunk.map((item) => {
            return new Promise((resolve, reject) => {
                // upload code goes here; resolve() on success, reject(err) on failure
            });
        })
    );
}
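
One way the "upload code" placeholder above could be filled in, sketched here against a hypothetical callback-style helper uploadItem(item, callback) that is not part of the original code:

function uploadDataChunk(chunk) {
    return Promise.all(
        chunk.map((item) => {
            return new Promise((resolve, reject) => {
                // uploadItem is a hypothetical callback-style upload helper;
                // settle the promise so Promise.all can track every item
                uploadItem(item, (err, result) => {
                    if (err) {
                        reject(err);
                    } else {
                        resolve(result);
                    }
                });
            });
        })
    );
}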


Recommended Answer

You should first split your array into chunks of 50. Then you need to make the requests one by one, not all at once. Promises can be used for this purpose.

Consider this implementation:

function parseData() { } // returns an array of 1000 items

async function uploadData(dataArray) {
  try {
    const chunks = chunkArray(dataArray, 50);
    for(const chunk of chunks) {
      await uploadDataChunk(chunk);
    }
  } catch(error) {
    // Catch an error here
  }
}

function uploadDataChunk(chunk) {
  // return a promise of chunk uploading result
}

const dataArray = parseData();
uploadData(dataArray);

Using async/await will use promises under the hood, so that await will wait until the current chunk is uploaded and only then will the next one be uploaded (if no error occurred).

And here is my proposed implementation of the chunkArray function:

function chunkArray(array, chunkSize) {
  return Array.from(
    { length: Math.ceil(array.length / chunkSize) },
    (_, index) => array.slice(index * chunkSize, (index + 1) * chunkSize)   
  );
}
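
For example (illustrative values only), splitting a seven-item array into chunks of three gives:

const chunks = chunkArray([1, 2, 3, 4, 5, 6, 7], 3);
console.log(chunks); // [ [ 1, 2, 3 ], [ 4, 5, 6 ], [ 7 ] ]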

Note: this code uses ES6 features, so it is desirable to use Babel / TypeScript.

If you create multiple asynchronous database connections, just use some database pooling tool.
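
The answer doesn't name a specific tool; purely as an illustration, a sketch using the pool built into the mysql npm package might look like this (all connection details below are placeholders):

const mysql = require('mysql');

// A pool capped at 10 connections; queries beyond that limit are queued
// instead of opening new connections.
const pool = mysql.createPool({
  connectionLimit: 10,
  host: 'localhost',
  user: 'dbuser',
  password: 'secret',
  database: 'mydb'
});

// pool.query() borrows a connection, runs the query, and releases it
pool.query('SELECT 1 + 1 AS two', (err, results) => {
  if (err) throw err;
  console.log(results[0].two); // 2
});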

If you want to upload all the items within a chunk asynchronously, and start uploading the next chunk only once the previous one has finished, you can do it this way:

function uploadDataChunk(chunk) {
  return Promise.all(
    chunk.map(uploadItemToGoogleCloud) // uploadItemToGoogleCloud should return a promise
  );
}
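
uploadItemToGoogleCloud isn't shown in the answer; purely as an illustration, a promise-returning uploader built on the @google-cloud/storage client could look roughly like this (the bucket name and the item.localFilePath field are assumptions, not part of the original):

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// bucket.upload() returns a promise that resolves once the local file
// has been stored in the bucket, so Promise.all() can track each item.
function uploadItemToGoogleCloud(item) {
  return storage.bucket('my-bucket').upload(item.localFilePath);
}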
