Faster s3 bucket duplication


Question

I have been trying to find a better command line tool for duplicating buckets than s3cmd. s3cmd can duplicate buckets without having to download and upload each file. The command I normally run to duplicate buckets using s3cmd is:

s3cmd cp -r --acl-public s3://bucket1 s3://bucket2

This works, but it is very slow as it copies each file via the API one at a time. If s3cmd could run in parallel mode, I'd be very happy.

Are there other options available, as command line tools or code, that people use to duplicate buckets faster than s3cmd?

Looks like s3cmd-modification is exactly what I'm looking for. Too bad it does not work. Are there any other options?

Answer

AWS CLI seems to do the job perfectly, and has the bonus of being an officially supported tool.

aws s3 sync s3://mybucket s3://backup-mybucket

http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

Supports concurrent transfers by default. See http://docs.aws.amazon.com/cli/latest/topic/s3-config.html#max-concurrent-requests

To quickly transfer a huge number of small files, run the command from an EC2 instance to decrease latency, and increase max_concurrent_requests to reduce the impact of latency. For example:

aws configure set default.s3.max_concurrent_requests 200
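The same setting can be made persistent in `~/.aws/config` instead. The values below are illustrative (tune them to your bandwidth and instance size); `max_queue_size` is a related knob that bounds how many tasks the CLI queues ahead of the workers:

```ini
[default]
s3 =
  max_concurrent_requests = 200
  max_queue_size = 10000
```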
