S3 file upload stream using node js


Problem Description






I am trying to find some solution to stream file on amazon S3 using node js server with requirements:

  • Don't store a temp file on the server, on disk or in memory. Buffering up to some limit (but never the complete file) is acceptable during upload.
  • No restriction on uploaded file size.
  • Don't block the server until the upload completes; otherwise, during a heavy file upload, the waiting time for other requests would increase unexpectedly.

I don't want to upload directly from the browser, because the S3 credentials would need to be shared in that case. One more reason to upload from the node js server is that some authentication may also need to be applied before uploading the file.

I tried to achieve this using node-multiparty, but it was not working as expected. You can see my solution and the issue at https://github.com/andrewrk/node-multiparty/issues/49. It works fine for small files but fails for a file of size 15MB.

Any solution or alternative?

Solution

You can now use streaming with the official Amazon SDK for nodejs, and what's even more awesome, you can finally do so without knowing the file size in advance. Simply pass the stream as the Body:

var AWS = require('aws-sdk');
var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) { console.log(evt); })
  .send(function(err, data) { console.log(err, data) });
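
As a usage note, `upload()` is the SDK's managed multipart upload, and in AWS SDK for JavaScript v2 it accepts a second options argument (`partSize`, `queueSize`, from the `S3.ManagedUpload` API) that bounds memory use: at most roughly partSize × queueSize bytes are buffered at once, regardless of total stream size. A small sketch of that bound, using the v2 defaults:

```javascript
// With the SDK's managed upload, peak memory stays around
// partSize * queueSize bytes no matter how large the stream is.
// The options would be passed as the second argument, e.g.:
//   s3obj.upload({Body: body}, {partSize: partSize, queueSize: queueSize}, cb);
var partSize = 5 * 1024 * 1024; // minimum allowed part size (5 MB)
var queueSize = 4;              // concurrent parts in flight (v2 default)

var maxBuffered = partSize * queueSize;
console.log(maxBuffered / (1024 * 1024) + ' MB'); // peak bytes buffered
```

Raising `queueSize` trades memory for upload parallelism, which matters exactly in the heavy-upload scenario described in the question.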

