Promise completed before file is written


Problem description

I am extremely new to Node.js and promises. After a Google search I found the code below and turned it into a promise that reads a CSV file using fast-csv and groups the rows into arrays keyed by packNr. The code below splits the data as expected, but my issue is writing each array back out to a new CSV file. Somehow I need to finish writing each packNr before the next one begins. With the code below, only the header of each file is written before the program completes.

var readCSV = function(path, options) {
    const datas = {}; // datas['123'] = CSV rows whose packNr is 123
    return new Promise(function(resolve, reject) {
        fastCsv
            .parseFile(path, options)
            .on('data', d => {
                if (!datas[d.packNr]) datas[d.packNr] = [];
                datas[d.packNr].push(d);
            })
            .on('end', () => {
                Object.keys(datas).forEach(packNr => {
                    // For each packNr, write a new CSV file
                    fastCsv
                        .write(datas[packNr], options)
                        .pipe(fs.createWriteStream(`./data-id-${packNr}.csv`));
                });
                // Resolves before the write streams above have finished
                resolve();
            });
    });
};

Recommended answer

fastCsv is a stream. Streams have their own flow (if I can say that); they inherit from the EventEmitter constructor, so they are event emitters. That means that to fit their processing into your program's flow, you should use their events.

When a stream connection closes, it emits a finish and a close event. So you can resolve your promise from one of those, like so:

.on('end', () => {

    Object.keys(datas).forEach(packNr => {
        // For each packNr, write a new CSV file
        const ws = fastCsv
            .write(datas[packNr], options)
            .pipe(fs.createWriteStream(`./data-id-${packNr}.csv`));

        ws.on('finish', () => resolve());
        // ws.on('close', () => resolve()) should work as well
    });

})

Another solution, actually even better as it handles all of a stream's end-of-connection events (end, finish, close, error), is to use the pipeline method from the stream module:

   // at the top of your file
   const { pipeline } = require('stream')

     .on('end', () => {

        Object.keys(datas).forEach(packNr => {
            // For each packNr, write a new CSV file
            pipeline(
               fastCsv.write(datas[packNr], options),
               fs.createWriteStream(`./data-id-${packNr}.csv`),
               e => {
                    if (e) reject(e)
                    else resolve()
               }
            )
        })
     })
   

===== (Edit) final solution =====

There is still a problem in my logic, because each iteration of forEach() creates a new stream: the promise resolves at the end of the first stream to finish, not at the end of the last one. To fix it, you can count the completed streams:

       const { pipeline } = require('stream')

         .on('end', () => {

            const packNrs = Object.keys(datas)
            const lg = packNrs.length
            let i = 0
            packNrs.forEach(packNr => {
                // For each packNr, write a new CSV file
                pipeline(
                   fastCsv.write(datas[packNr], options),
                   fs.createWriteStream(`./data-id-${packNr}.csv`),
                   e => {
                        if (e) reject(e)
                        i++
                        if (i === lg) resolve()
                   }
                )
            })
         })

