Image file cut off when uploading to AWS S3 bucket via Django and Boto3


Question

When I upload a larger image (3+ MB) to an AWS S3 bucket, only part of the image is saved to the bucket (roughly the top 10% of the image, with the rest displaying as grey space). These truncated images consistently show a size of 256 KB. There isn't any issue with smaller files.

Here is my code:

s3 = boto3.resource('s3')
s3.Bucket(settings.AWS_MEDIA_BUCKET_NAME).put_object(Key=fname, Body=data)

...where data is the binary data of the image file.
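One common cause of this symptom in a Django/boto3 view is that the file object's read pointer is no longer at the start when the body is read (for example, an earlier validation pass already consumed part of the stream), so `put_object` receives only the remainder. A minimal, framework-free sketch of the pitfall and the fix (the helper name is ours, not from the original post):

```python
import io

def full_bytes(fileobj):
    # Rewind before reading: if earlier code already consumed part of
    # the stream, a bare .read() returns only what is left.
    fileobj.seek(0)
    return fileobj.read()

# Demonstration: a partially consumed stream truncates unless rewound.
buf = io.BytesIO(b"x" * 1024)
buf.read(256)                    # something already read the first 256 bytes
print(len(full_bytes(buf)))      # 1024, the complete payload
```

With that in place, the upload becomes `put_object(Key=fname, Body=full_bytes(uploaded_file))`, or equivalently a `seek(0)` before the existing `read()`.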

There are no issues when the files are smaller; in the S3 bucket, the larger files all show as 256 KB.

I haven't been able to find any documentation about why this might be happening. Can someone please point out what I'm missing?

Thanks!

Answer

I had the same issue and it took me hours to figure it out. I finally fixed it by creating a stream. This is my code:

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

const uploadFile = (filePath) => {
  // Stream the file from disk instead of buffering it all in memory;
  // s3.upload handles multipart uploads for large bodies.
  const body = fs.createReadStream(filePath);
  const params = {
    Bucket: 'bucketname', // pass your bucket name
    Key: filePath,
    Body: body,
    ContentType: 'image/jpeg',
  };
  s3.upload(params, function (s3Err, data) {
    if (s3Err) throw s3Err;
    console.log(`File uploaded successfully at ${data.Location}`);
  });
};

