Pushing binary data to Amazon S3 using Node.js

Problem description

I'm trying to take an image and upload it to an Amazon S3 bucket using Node.js. In the end, I want to be able to push the image up to S3, and then be able to access that S3 URL and see the image in a browser. I'm using a Curl query to do an HTTP POST request with the image as the body.

curl -kvX POST --data-binary "@test.jpg" 'http://localhost:3031/upload/image'

Then on the Node.js side, I do this:

var AWS = require('aws-sdk');

exports.pushImage = function(req, res) {
    // Wrap the request body in a Buffer (this is the part that isn't working).
    var image = new Buffer(req.body);
    var s3bucket = new AWS.S3();
    s3bucket.createBucket(function() {
        var params = {Bucket: 'My/bucket', Key: 'test.jpg', Body: image};
        // Put the object into the bucket.
        s3bucket.putObject(params, function(err) {
            if (err) {
                res.writeHead(403, {'Content-Type': 'text/plain'});
                res.write("Error uploading data");
                res.end();
            } else {
                res.writeHead(200, {'Content-Type': 'text/plain'});
                res.write("Success");
                res.end();
            }
        });
    });
};

My file is 0 bytes, as shown on Amazon S3. How do I make it so that I can use Node.js to push the binary file up to S3? What am I doing wrong with binary data and buffers?

UPDATE:

I found out what I needed to do. The curl query is the first thing that should be changed. This is the working one:

curl -kvX POST -F foobar=@my_image_name.jpg 'http://localhost:3031/upload/image'

Then, I added a line to convert to a Stream. This is the working code:

var fs = require('fs');
var AWS = require('aws-sdk');

exports.pushImage = function(req, res) {
    var s3bucket = new AWS.S3();
    s3bucket.createBucket(function() {
        // Stream the uploaded file from the temp path written by the multipart parser.
        var bodyStream = fs.createReadStream(req.files.foobar.path);
        var params = {Bucket: 'My/bucket', Key: 'test.jpg', Body: bodyStream};
        // Put the object into the bucket.
        s3bucket.putObject(params, function(err) {
            if (err) {
                res.writeHead(403, {'Content-Type': 'text/plain'});
                res.write("Error uploading data");
                res.end();
            } else {
                res.writeHead(200, {'Content-Type': 'text/plain'});
                res.write("Success");
                res.end();
            }
        });
    });
};

So, in order to upload a file to an API endpoint (using Node.js and Express) and have the API push that file to Amazon S3, you first need to perform a multipart POST request with a form field (here foobar) carrying the file. The file ends up on the API side, where it most likely sits in a temp directory. Amazon's S3 putObject method accepts a Stream as the Body, so you create a read stream by giving the 'fs' module the path where the uploaded file was written.
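For illustration, here is a minimal sketch of the kind of Express wiring that produces req.files; the Express 3 bodyParser middleware and the routes module name are assumptions, not taken from the post above.

var express = require('express');
var routes = require('./routes'); // hypothetical module exporting pushImage above

var app = express();
// Express 3's bodyParser includes a multipart parser (Formidable) that
// writes uploads to a temp directory and exposes them on req.files.
app.use(express.bodyParser());

app.post('/upload/image', routes.pushImage);
app.listen(3031);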

I don't know if this is the proper way to upload data, but it works. Does anyone know if there is a way to POST binary data inside the request body and have the API send that to S3? I don't quite know what the difference is between a multipart upload and a standard POST with the data in the body.
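For what it's worth, the request body can also be consumed as raw binary by buffering the stream and handing the resulting Buffer to putObject; the sketch below is an assumption on top of the original post (no body-parsing middleware consuming the stream, placeholder route and bucket names), not the poster's solution.

var AWS = require('aws-sdk');

exports.pushRawImage = function(req, res) {
    var chunks = [];
    req.on('data', function(chunk) { chunks.push(chunk); });
    req.on('end', function() {
        var image = Buffer.concat(chunks); // entire binary body, e.g. from curl --data-binary
        var s3 = new AWS.S3();
        s3.putObject({ Bucket: 'my-bucket', Key: 'test.jpg', Body: image }, function(err) {
            if (err) {
                res.writeHead(500, {'Content-Type': 'text/plain'});
                return res.end('Error uploading data');
            }
            res.writeHead(200, {'Content-Type': 'text/plain'});
            res.end('Success');
        });
    });
};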

Recommended answer

I believe you need to pass the content-length in the header as documented on the S3 docs: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
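For reference, here is a minimal sketch of that idea using the official aws-sdk that the question already uses (my own illustration rather than part of this answer); the file path and bucket name are placeholders.

var fs = require('fs');
var AWS = require('aws-sdk');

var s3 = new AWS.S3();
var img_path = '/tmp/test.jpg'; // placeholder path

// Stat the file first so ContentLength can be sent along with the stream.
fs.stat(img_path, function(err, stats) {
    if (err) throw err;
    s3.putObject({
        Bucket: 'my-bucket',                 // placeholder bucket
        Key: 'test.jpg',
        Body: fs.createReadStream(img_path),
        ContentLength: stats.size
    }, function(err, data) {
        if (err) return console.error(err);
        console.log('Uploaded, ETag:', data.ETag);
    });
});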

After spending quite a bit of time working on pushing assets to S3, I ended up using the AwsSum library with excellent results in production:

https://github.com/awssum/awssum-amazon-s3/

(See the documentation on setting your AWS credentials)

Example:

var fs = require('fs');
// s3 is an AwsSum S3 client; create it with your AWS credentials and region
// as shown in the awssum-amazon-s3 documentation linked above.

var bucket_name = 'your-bucket-name'; // AwsSum also has an API for this if you need to create the bucket

var img_path = 'path_to_file';
var filename = 'your_new_filename';

// using stat to get the size so ContentLength can be set
fs.stat(img_path, function(err, file_info) {
    if (err) {
        // handle the stat error (missing file, bad path, ...)
        return;
    }

    var bodyStream = fs.createReadStream(img_path);

    var params = {
        BucketName    : bucket_name,
        ObjectName    : filename,
        ContentLength : file_info.size,
        Body          : bodyStream
    };

    s3.putObject(params, function(err, data) {
        if (err) {
            // handle the upload error
            return;
        }
        var aws_url = 'https://s3.amazonaws.com/' + bucket_name + '/' + filename;
    });

});

UPDATE

So, if you are using something like Express or Connect, whose multipart parsing is built on Formidable, then you don't have access to the file stream, because Formidable writes files to disk. Depending on how you upload it on the client side, the image will be in either req.body or req.files. In my case, I use Express, and on the client side I post other data as well, so the image has its own parameter and is accessed as req.files.img_data. However you access it, that param is what you pass in as img_path in the above example.
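As an illustration of that last point, here is a minimal sketch of a route pulling the temp path out of req.files when the image is posted under a field named img_data alongside other form data; the Express 3 bodyParser setup and the extra caption field are assumptions, not part of this answer.

var express = require('express');
var app = express();
app.use(express.bodyParser()); // Formidable writes the upload to a temp file

app.post('/upload/image', function(req, res) {
    var img_path = req.files.img_data.path; // temp file on disk
    var filename = req.files.img_data.name; // original filename from the client
    var caption  = req.body.caption;        // any other form fields land on req.body
    console.log('received %s (%s) at %s', filename, caption, img_path);
    // img_path is what the AwsSum example above takes as img_path
    res.end('Success');
});

app.listen(3031);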

If you need or want to stream the file, that is trickier, though certainly possible. And if you aren't manipulating the image, you may want to look at taking a CORS approach and uploading directly to S3, as discussed here: Stream that user uploads directly to Amazon s3
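One common way to do that direct-to-S3 upload is a presigned URL; the sketch below uses the official aws-sdk's getSignedUrl rather than AwsSum, with placeholder bucket and key names, and is not necessarily what the linked post describes.

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

// Generate a short-lived URL the browser can PUT the file to directly,
// provided the bucket's CORS configuration allows it.
var params = { Bucket: 'my-bucket', Key: 'test.jpg', Expires: 60 };
s3.getSignedUrl('putObject', params, function(err, url) {
    if (err) return console.error(err);
    console.log('Upload directly with: curl -X PUT --data-binary @test.jpg "' + url + '"');
});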
