Pushing binary data to Amazon S3 using Node.js

Question

I'm trying to take an image and upload it to an Amazon S3 bucket using Node.js. In the end, I want to be able to push the image up to S3, and then be able to access that S3 URL and see the image in a browser. I'm using a Curl query to do an HTTP POST request with the image as the body.

curl -kvX POST --data-binary @test.jpg 'http://localhost:3031/upload/image'

Then on the Node.js side, I do this:

var AWS = require('aws-sdk');

exports.pushImage = function(req, res) {
    var image = new Buffer(req.body);
    var s3bucket = new AWS.S3();
    s3bucket.createBucket(function() {
        var params = {Bucket: 'My/bucket', Key: 'test.jpg', Body: image};
        // Put the object into the bucket.
        s3bucket.putObject(params, function(err) {
            if (err) {
                res.writeHead(403, {'Content-Type': 'text/plain'});
                res.write("Error uploading data");
                res.end();
            } else {
                res.writeHead(200, {'Content-Type': 'text/plain'});
                res.write("Success");
                res.end();
            }
        });
    });
};

My file is 0 bytes, as shown on Amazon S3. How do I make it so that I can use Node.js to push the binary file up to S3? What am I doing wrong with binary data and buffers?
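A hedged guess at the root cause, for context (assuming a JSON-style body-parsing middleware was active, which the question does not confirm): such parsers do not fill `req.body` with raw bytes for an `image/jpeg` POST, so the Buffer built from it carries essentially no image data. The snippet stands in for that with an empty parsed body serialized to JSON:

```javascript
// Illustration only: 'parsedBody' stands in for what req.body typically
// looks like when no body parser matches the request's content type.
var parsedBody = {};
var image = Buffer.from(JSON.stringify(parsedBody));
console.log(image.length); // 2 -- just the bytes of "{}", not the image
```

That would explain an object of (nearly) zero bytes landing on S3, regardless of how large the uploaded image was.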

Update:

I found out what I needed to do. The curl query is the first thing that should be changed. This is the working one:

curl -kvX POST -F foobar=@my_image_name.jpg 'http://localhost:3031/upload/image'

Then, I added a line to convert to a Stream. This is the working code:

var AWS = require('aws-sdk');
var fs = require('fs');

exports.pushImage = function(req, res) {
    var s3bucket = new AWS.S3();
    s3bucket.createBucket(function() {
        // Stream the uploaded temp file instead of reading req.body.
        var bodyStream = fs.createReadStream(req.files.foobar.path);
        var params = {Bucket: 'My/bucket', Key: 'test.jpg', Body: bodyStream};
        // Put the object into the bucket.
        s3bucket.putObject(params, function(err) {
            if (err) {
                res.writeHead(403, {'Content-Type': 'text/plain'});
                res.write("Error uploading data");
                res.end();
            } else {
                res.writeHead(200, {'Content-Type': 'text/plain'});
                res.write("Success");
                res.end();
            }
        });
    });
};

So, in order to upload a file to an API endpoint (using Node.js and Express) and have the API push that file to Amazon S3, you first need to perform a POST request with the "files" field populated. The file ends up on the API side, where it probably resides in some tmp directory. Amazon's S3 putObject method accepts a stream, so you can create a read stream by giving the 'fs' module the path to the uploaded file.

I don't know if this is the proper way to upload data, but it works. Does anyone know if there is a way to POST binary data inside the request body and have the API send that to S3? I don't quite know what the difference is between a multipart upload and a standard POST with the data in the body.

Answer

I believe you need to pass the Content-Length header, as documented in the S3 docs: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html

After spending quite a bit of time working on pushing assets to S3, I ended up using the AwsSum library with excellent results in production:

https://github.com/awssum/awssum-amazon-s3/

(See the documentation on setting your AWS credentials)

For example:

var fs = require('fs');

// 's3' is an AwsSum Amazon S3 client; see the AwsSum documentation for
// constructing it with your AWS credentials.
var bucket_name = 'your-bucket-name'; // AwsSum also has an API for creating buckets if you need it

var img_path = 'path_to_file';
var filename = 'your_new_filename';

// Use stat to get the file size so ContentLength can be set.
fs.stat(img_path, function(err, file_info) {
    if (err) { /* handle stat error */ return; }

    var bodyStream = fs.createReadStream(img_path);

    var params = {
        BucketName    : bucket_name,
        ObjectName    : filename,
        ContentLength : file_info.size,
        Body          : bodyStream
    };

    s3.PutObject(params, function(err, data) {
        if (err) { /* handle upload error */ return; }
        var aws_url = 'https://s3.amazonaws.com/' + bucket_name + '/' + filename;
    });
});

Update

So, if you are using something like Express or Connect, which are built on Formidable, then you don't have access to the file stream, as Formidable writes files to disk. So depending on how you upload the image on the client side, it will end up in either req.body or req.files. In my case, I use Express, and on the client side I post other data as well, so the image has its own parameter and is accessed as req.files.img_data. However you access it, that param is what you pass in as img_path in the example above.

If you need to or want to stream the file, that is trickier, though certainly possible. And if you aren't manipulating the image, you may want to look at taking a CORS approach and uploading directly to S3, as discussed here: http://stackoverflow.com/questions/14733928/stream-that-user-uploads-directly-to-amazon-s3
