S3 file upload stream using node js


Question

I am trying to find a way to stream file uploads to Amazon S3 from a Node.js server, with the following requirements:

  • Don't store a temporary file on the server or in memory. Buffering incomplete file data is acceptable, but only up to some limit.
  • No restriction on the size of the uploaded file.
  • Don't block the server until the file upload completes; with a heavy upload in progress, the waiting time for other requests would otherwise grow unexpectedly.

I don't want to upload files directly from the browser, because the S3 credentials would need to be shared in that case. Another reason to upload from the Node.js server is that some authentication may also need to be applied before uploading the file.

I tried to achieve this using node-multiparty, but it was not working as expected. You can see my solution and the issue at https://github.com/andrewrk/node-multiparty/issues/49. It works fine for small files but fails for a file of size 15MB.

Any solution or alternative?

Answer

You can now use streaming with the official Amazon SDK for Node.js; see the section "Uploading a File to an Amazon S3 Bucket" in its documentation, or their example on GitHub.

Even better, you can finally do so without knowing the file size in advance. Simply pass the stream as the Body:

var fs = require('fs');
var zlib = require('zlib');
var AWS = require('aws-sdk'); // this require was missing from the original snippet

// Gzip the file on the fly; the resulting stream has an unknown length.
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());

// Bind the bucket and key once so they don't have to be repeated per call.
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});

// upload() performs a managed multipart upload and accepts a stream as Body.
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) { console.log(evt); })
  .send(function(err, data) { console.log(err, data); });
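To connect this back to the scenario in the question, here is a minimal, untested sketch of how the same upload() call can be wired into a server that receives multipart uploads via node-multiparty (the module the question used). Each incoming file part is itself a readable stream, so it can be handed to S3 as the Body without being written to disk or buffered in full. The bucket name, the use of the client's filename as the key, and the port are assumptions for illustration; upload() also accepts an optional second argument for tuning the multipart behaviour.

var http = require('http');
var multiparty = require('multiparty');
var AWS = require('aws-sdk');

var s3 = new AWS.S3();

http.createServer(function (req, res) {
  var form = new multiparty.Form();

  // 'part' fires once per form field; for file fields, `part` is a
  // readable stream, so it never needs to touch the disk.
  form.on('part', function (part) {
    if (!part.filename) {
      part.resume(); // drain non-file fields
      return;
    }
    s3.upload({
      Bucket: 'myBucket',  // assumed bucket name
      Key: part.filename,  // assumed key: the client's filename
      Body: part           // stream of unknown length; upload() handles it
    }, {
      partSize: 10 * 1024 * 1024, // 10 MB multipart chunks
      queueSize: 4                // up to 4 parts in flight at once
    }, function (err, data) {
      if (err) { res.statusCode = 500; return res.end('upload failed'); }
      res.end('uploaded to ' + data.Location);
    });
  });

  form.on('error', function () {
    res.statusCode = 400;
    res.end('bad request');
  });

  form.parse(req);
}).listen(8080);

Because upload() consumes the Body stream with backpressure, a slow connection to S3 simply slows down the incoming request rather than accumulating data in memory, which is what the "no temporary files" and "don't block the server" requirements ask for.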

