Loading a file directly from the Node.js req body to S3


Problem description

I am trying to upload files to S3 using the Node.js AWS SDK, and I want to take the file directly from the request body rather than saving it to disk first. However, I keep getting TypeError: buf.copy is not a function. Below is my code:

create: function(req, res) {
    var imageFile = req.file('imageFile');
    var fileName = Math.floor(Date.now() / 1000);
    var key = settings.aws.Key;
    var secret = settings.aws.Secret;
    var bucket = settings.aws.Bucket;

    AWS.config.update({accessKeyId: key, secretAccessKey: secret});

    var parames = {Bucket: bucket, Key: fileName, Body: imageFile};
    var s3Obj = new AWS.S3();

    s3Obj.upload(parames)
        .on('httpUploadProgress', function(evt) { console.log("In " + evt.loaded); })
        .send(function(err, data) {
            if (err) {
                return ValidationService.jsonResolveError(err, Inventory, res);
            }
            console.log(data);
            res.json({status: 200, file: data});
        });
}

And here is a more detailed stack trace of the error I keep getting:

buffer.js:237
    buf.copy(buffer, pos);
        ^

TypeError: buf.copy is not a function
    at Function.Buffer.concat (buffer.js:237:9)
    at ManagedUpload.fillStream (/node_modules/aws-sdk/lib/s3/managed_upload.js:389:21)
    at Upstream.<anonymous> (/node_modules/aws-sdk/lib/s3/managed_upload.js:172:28)
    at emitNone (events.js:67:13)
    at Upstream.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at nextTickCallbackWith2Args (node.js:455:9)
    at process._tickDomainCallback (node.js:410:17)

Recommended answer

Starting with Node 8, you can make use of async/await and stream a file from a local directory to S3 as follows:

// Required modules: AWS SDK v2 plus Node core modules for file, path and stream handling.
const AWS = require('aws-sdk');
const fs = require('fs');
const path = require('path');
const stream = require('stream');

// Credentials and region are picked up from the environment or AWS config.
const s3 = new AWS.S3();

async function uploadFile(filePath, folderPath) {
  // Read the local file and pipe it through a PassThrough stream,
  // which the S3 managed upload consumes as the object body.
  const readStream = fs.createReadStream(filePath);
  const writeStream = new stream.PassThrough();
  readStream.pipe(writeStream);

  const fname = path.basename(filePath);

  const params = {
    Bucket: 's3-bucket-name',
    Key: folderPath + '/' + fname,
    Body: writeStream
  };

  // Wrap the callback-style s3.upload() in a Promise so it can be awaited.
  const uploadPromise = new Promise((resolve, reject) => {
    s3.upload(params, (err, data) => {
      if (err) {
        //logger.error('upload error..', err);
        reject(err);
      } else {
        //logger.debug('upload done..');
        resolve(data);
      }
    });
  });

  const res = await uploadPromise;
  return res;
}
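
As a rough usage sketch (the local path and folder name below are placeholders for illustration, not values from the question), the helper can be awaited from any async context:

// Hypothetical inputs: a local file and a target "folder" (key prefix) in the bucket.
(async () => {
  try {
    const result = await uploadFile('./uploads/photo.jpg', 'images');
    console.log('Uploaded to:', result.Location); // s3.upload() reports the object URL in Location
  } catch (err) {
    console.error('Upload failed:', err);
  }
})();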
