Node aws-sdk s3 file upload sizes
Question
When using the aws-sdk npm plugin for node.js, I can upload a 50kb pdf with the following code (to AWS S3):
var params = {
  Bucket: BUCKET,
  Key: pdf_key,
  Body: file,
  ContentType: 'application/pdf'
};

var s3 = new AWS.S3();
s3.putObject(params, function(error, data) {
  console.log(data);
  console.log(error);
  if (error) {
    console.log(error);
    callback(error, null);
  } else {
    callback(null, pdf_key);
  }
});
But when uploading an 11MB pdf, even when specifying the ContentLength, the upload just continues forever, even with a timeout of 2 minutes.
The question is: how do I make AWS S3 accept the large pdf file?
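(Editor's note, not part of the original question: the two-minute hang matches the SDK's own socket timeout, which in aws-sdk v2 defaults to 120000 ms. It can be raised when constructing the client; a minimal config sketch, assuming aws-sdk v2's httpOptions:)

```javascript
var AWS = require('aws-sdk');

// Raise the per-request socket timeout from the 2-minute default
// so slow uploads of large bodies aren't cut off mid-transfer.
var s3 = new AWS.S3({
  httpOptions: { timeout: 10 * 60 * 1000 } // 10 minutes, in milliseconds
});
```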
UPDATE
I have still not found any documentation or answers for this question.
UPDATE 2
I will accept answers which show this or another framework that can do it. I will need that framework to also allow auth-read of the object.
UPDATE 3: I got it working for now, but I still haven't found a reason why it shouldn't have worked.
Thanks in advance!
Answer
Connecting to S3 isn't fast, and depending on network fluctuations you can get timeouts and other odd behavior.
The code you provided is fine, but you could take advantage of multipart uploads, which can solve these problems, especially with files larger than 5MB.
I made a rough implementation of a multipart upload that also retries the upload of any failed part up to 3 times; it works for files smaller than 5MB as well.
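The answerer's actual code did not survive in this copy of the post. A minimal sketch of the same idea, assuming the aws-sdk v2 multipart API (createMultipartUpload, uploadPart, completeMultipartUpload); the bucket/key names and the s3 client wiring are placeholders:

```javascript
// Retry a callback-style operation up to maxAttempts times before giving up.
function withRetries(operation, maxAttempts, done) {
  var attempt = 0;
  function run() {
    attempt += 1;
    operation(function (err, result) {
      if (err && attempt < maxAttempts) return run();
      done(err, result);
    });
  }
  run();
}

// Upload `buffer` to S3 in 5MB parts, retrying each part up to 3 times.
// `s3` is expected to expose the aws-sdk v2 multipart methods.
function multipartUpload(s3, bucket, key, buffer, callback) {
  var PART_SIZE = 5 * 1024 * 1024; // S3's minimum part size (only the last part may be smaller)
  s3.createMultipartUpload({ Bucket: bucket, Key: key, ContentType: 'application/pdf' },
    function (err, mp) {
      if (err) return callback(err);
      var numParts = Math.max(1, Math.ceil(buffer.length / PART_SIZE));
      var parts = new Array(numParts);
      var completed = 0;
      var failed = false;
      for (var i = 0; i < numParts; i++) {
        (function (partNumber) {
          var start = (partNumber - 1) * PART_SIZE;
          var body = buffer.slice(start, start + PART_SIZE);
          withRetries(function (cb) {
            s3.uploadPart({
              Bucket: bucket,
              Key: key,
              UploadId: mp.UploadId,
              PartNumber: partNumber,
              Body: body
            }, cb);
          }, 3, function (err, data) {
            if (failed) return;
            if (err) { failed = true; return callback(err); }
            parts[partNumber - 1] = { ETag: data.ETag, PartNumber: partNumber };
            if (++completed === numParts) {
              // All parts are in; stitch them together on the S3 side.
              s3.completeMultipartUpload({
                Bucket: bucket,
                Key: key,
                UploadId: mp.UploadId,
                MultipartUpload: { Parts: parts }
              }, callback);
            }
          });
        })(i + 1);
      }
    });
}
```

For what it's worth, newer versions of the aws-sdk also ship a managed uploader, s3.upload(), which splits large bodies into parts and retries them for you, so hand-rolling this is only necessary when you need full control over part handling.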