Uploading multiple files to AWS S3 using NodeJS


Question

I'm trying to upload all files within my directory to my S3 bucket using NodeJS. I'm able to upload one file at a time if I explicitly give the file path + literal string for the Key: field.

Below is the script I'm using:

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

// reg ex to match
var re = /\.txt$/;

// ensure that this file is in the directory of the files you want to run the cronjob on

// ensure that this file is in the directory of the files you want to run the cronjob on
fs.readdir(".", function(err, files) {
    if (err) {
        console.log("Could not list the directory.", err)
        process.exit(1)
    }


    var matches = files.filter( function(text) { return re.test(text) } )
    console.log("These are the files you have", matches)
    var numFiles = matches.length


    if ( numFiles ) {
        // Read in the file, convert it to base64, store to S3

        for (var i = 0; i < numFiles; i++) {
            var fileName = matches[i]

            fs.readFile(fileName, function (err, data) {
                if (err) { throw err }

                // data is already a Buffer; Buffer.from avoids the deprecated new Buffer constructor
                var base64data = Buffer.from(data);


                var s3 = new AWS.S3()
                    s3.putObject({
                       'Bucket': 'noonebetterhaventakenthisbucketnname',
                        'Key': fileName,
                        'Body': base64data,
                        'ACL': 'public-read'
                     }, function (resp) {
                        console.log(arguments);
                        console.log('Successfully uploaded, ', fileName)
                    })
            })

        }

    }

})

It produces this error for each file attempted to upload to S3:

These are the files you have [ 'test.txt', 'test2.txt' ]
{ '0': null,
  '1': { ETag: '"2cad20c19a8eb9bb11a9f76527aec9bc"' } }
Successfully uploaded,  test2.txt
{ '0': null,
  '1': { ETag: '"2cad20c19a8eb9bb11a9f76527aec9bc"' } }
Successfully uploaded,  test2.txt

Edit: updated to read the key from a variable name instead of matches[i].

Why does it only upload test2.txt, and how do I get it to upload each file within my matches variable?

Answer

Referenced this Asynchronously reading and caching multiple files in nodejs to arrive at a solution.

tl;dr: it's a scope issue. The variables need to be wrapped in a closure; one way is to move the readFile and s3.putObject calls into a function and call that function from the for loop.
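The failure mode is not S3-specific: it reproduces with any callback created inside a var-scoped loop and invoked later. A minimal, self-contained sketch (file names are illustrative; the deferred callbacks stand in for readFile's callback):

```javascript
// Sketch of the scoping bug: `var` is function-scoped, so every
// callback created in the loop closes over the SAME binding.
var names = ['test.txt', 'test2.txt'];
var callbacks = [];

for (var i = 0; i < names.length; i++) {
    var fileName = names[i];              // one shared binding, reassigned each pass
    callbacks.push(function () {
        return fileName;                  // read later, after the loop has finished
    });
}

// By the time the callbacks actually run, fileName holds its final value:
var results = callbacks.map(function (cb) { return cb(); });
console.log(results);  // [ 'test2.txt', 'test2.txt' ]
```

Moving the loop body into a function, as the solution below does, gives each call its own fileName parameter binding.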

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

var s3 = new AWS.S3()

function read(file) {
    fs.readFile(file, function (err, data) {
        if (err) { throw err }

        // data is already a Buffer; Buffer.from avoids the deprecated new Buffer constructor
        var base64data = Buffer.from(data);

        s3.putObject({
           'Bucket': 'noonebetterhaventakenthisbucketnname',
            'Key': file,
            'Body': base64data,
            'ACL': 'public-read'
         }, function (err, resp) {
            if (err) { throw err }
            console.log('Successfully uploaded,', file)
        })
    })
}

// reg ex to match
var re = /\.txt$/;

// ensure that this file is in the directory of the files you want to run the cronjob on
fs.readdir(".", function(err, files) {
    if (err) {
        console.log( "Could not list the directory.", err)
        process.exit( 1 )
    }

    var matches = files.filter( function(text) { return re.test(text) } )
    console.log("These are the files you have", matches)
    var numFiles = matches.length


    if ( numFiles ) {
        // Read in the file, convert it to base64, store to S3

        for (var i = 0; i < numFiles; i++) {
            read(matches[i])
        }

    }

})
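For completeness, two other common ways to give each iteration its own binding, without extracting a named function (a generic sketch, not tied to the S3 code; names are illustrative):

```javascript
var names = ['test.txt', 'test2.txt'];
var callbacks = [];

// 1. Array.prototype.forEach: each element arrives as a fresh
//    parameter, so each closure captures its own `fileName`.
names.forEach(function (fileName) {
    callbacks.push(function () { return fileName; });
});

// 2. ES6 `let` is block-scoped: each loop iteration gets a fresh
//    binding, so the original for-loop shape works without a wrapper.
for (let j = 0; j < names.length; j++) {
    let fileName = names[j];
    callbacks.push(function () { return fileName; });
}

var results = callbacks.map(function (cb) { return cb(); });
console.log(results);  // [ 'test.txt', 'test2.txt', 'test.txt', 'test2.txt' ]
```

Either variant would let the original loop upload every match, not just the last one.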

