Can we copy the files and folders recursively between aws s3 buckets using boto3 Python?
Question
Is it possible to copy all the files from one source bucket to another target bucket using boto3? The source bucket doesn't have a regular folder structure.
Source bucket: SRC
Source Path: A/B/C/D/E/F..
where folder D has some files and folder E has some files.
Target bucket: TGT
Target path: L/M/N/
I need to copy all the files and folders under folder C in the SRC bucket above to the TGT bucket under folder N, using boto3.
Is anyone aware of an API for this, or do we need to write a new Python script to complete this task?
Answer
S3 stores objects; it doesn't store folders, even if '/' or '\' is part of the object key name. You just need to manipulate the key names and copy the data over.
import boto3

old_bucket_name = 'SRC'
old_prefix = 'A/B/C/'
new_bucket_name = 'TGT'
new_prefix = 'L/M/N/'

s3 = boto3.resource('s3')
old_bucket = s3.Bucket(old_bucket_name)
new_bucket = s3.Bucket(new_bucket_name)

# list every object under the source prefix and copy it to the target bucket
for obj in old_bucket.objects.filter(Prefix=old_prefix):
    old_source = {'Bucket': old_bucket_name,
                  'Key': obj.key}
    # replace the prefix
    new_key = obj.key.replace(old_prefix, new_prefix, 1)
    new_obj = new_bucket.Object(new_key)
    new_obj.copy(old_source)
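Note that new_obj.copy() is boto3's managed copy, so large objects are copied with multipart copy automatically, and objects.filter() paginates for you, so the loop works for any number of keys. After the copy finishes, a quick way to sanity-check the result is to list what now sits under the target prefix (a minimal sketch reusing the new_bucket and new_prefix variables from above):

# print the key and size of every object now under the target prefix
for copied in new_bucket.objects.filter(Prefix=new_prefix):
    print(copied.key, copied.size)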
Optimized technique of defining new_key suggested by zvikico:
new_key = new_prefix + obj.key[len(old_prefix):]
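Both approaches yield the same target key, because the Prefix filter guarantees that every returned key starts with old_prefix; the slicing version simply skips the substring search. For example, with a hypothetical key:

obj_key = 'A/B/C/D/file.txt'                    # hypothetical key returned by the filter
obj_key.replace(old_prefix, new_prefix, 1)      # 'L/M/N/D/file.txt'
new_prefix + obj_key[len(old_prefix):]          # 'L/M/N/D/file.txt'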