Python: upload large files to S3 fast


Problem Description

I am trying to programmatically upload a very large file, up to 1 GB, to S3. Since AWS S3 supports multipart upload for large files, I found some Python code that does it. (link)

My issue: the upload speed was too slow (it took almost 1 minute).

Is there any way to increase the performance of the multipart upload? Or is there any good library that supports S3 uploading?

Recommended Answer

Leaving my answer here for reference: performance roughly doubled with this code:

import os
import sys
import threading

import boto3
from boto3.s3.transfer import TransferConfig

s3_client = boto3.client('s3')

S3_BUCKET = 'mybucket'
FILE_PATH = '/path/to/file/'
KEY_PATH = "/path/to/s3key/"

class ProgressPercentage:
    """Thread-safe callback that prints upload progress as a percentage."""
    def __init__(self, filename):
        self._filename = filename
        self._size = float(os.path.getsize(filename))
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            percentage = (self._seen_so_far / self._size) * 100
            sys.stdout.write("\r%s  %s / %s  (%.2f%%)" % (
                self._filename, self._seen_so_far, self._size, percentage))
            sys.stdout.flush()

def uploadFileS3(filename):
    # Split the file into chunks and upload up to 10 of them in parallel.
    config = TransferConfig(multipart_threshold=1024 * 25,
                            max_concurrency=10,
                            multipart_chunksize=1024 * 25,
                            use_threads=True)
    file = FILE_PATH + filename
    key = KEY_PATH + filename
    s3_client.upload_file(file, S3_BUCKET, key,
                          ExtraArgs={'ACL': 'public-read',
                                     'ContentType': 'video/mp4'},
                          Config=config,
                          Callback=ProgressPercentage(file))

uploadFileS3('upload.mp4')
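For intuition about what `multipart_chunksize` and `max_concurrency` control, here is a minimal, boto3-free sketch of the part-size arithmetic (the helper name `part_count` is illustrative, not a boto3 API). S3 requires each part to be at least 5 MiB (except the last) and allows at most 10,000 parts per upload, so very small chunk sizes get bumped up to the minimum in practice:

```python
import math

MIN_PART_SIZE = 5 * 1024 * 1024   # S3 minimum part size (5 MiB, except the last part)
MAX_PARTS = 10_000                # S3 maximum number of parts per upload

def part_count(file_size: int, chunk_size: int) -> int:
    """Number of parts a multipart upload would use for the given sizes."""
    # Chunks below the S3 minimum are effectively raised to 5 MiB.
    effective = max(chunk_size, MIN_PART_SIZE)
    return math.ceil(file_size / effective)

# A 1 GiB file with 25 MiB chunks uploads as 41 parts.
print(part_count(1024**3, 25 * 1024 * 1024))  # -> 41
```

More parts means more opportunity for the `max_concurrency` threads to upload in parallel, but per-part overhead grows, so chunk size is a tuning knob worth experimenting with.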

Special thanks to @BryceH for the suggestion. Although this solution did improve S3 upload performance, I am still open to any better solution. Thanks!

