Move S3 files older than 100 days to another bucket


Question


Is there a way to find all files that are older than 100 days in one S3 bucket and move them to a different bucket? Solutions using the AWS CLI or an SDK are both welcome. In the source bucket, the files are organized like bucket/type/year/month/day/hour/file:
S3://my-logs-bucket/logtype/2020/04/30/16/logfile.csv
For instance, on 2020/04/30, log files dated on or before 2020/01/21 would have to be moved.
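Since the date is encoded in each key, an SDK-based approach can skip checking object metadata and instead parse the year/month/day segments of the key. Below is a minimal sketch of that idea; the helper name `is_older_than` is hypothetical, and only the cutoff logic is shown:

```python
from datetime import date, timedelta

def is_older_than(key: str, cutoff: date) -> bool:
    """Return True if the key's embedded date is on or before the cutoff.

    Assumes the key layout from the question: type/year/month/day/hour/file.
    Hypothetical helper for illustration.
    """
    parts = key.split("/")
    year, month, day = int(parts[1]), int(parts[2]), int(parts[3])
    return date(year, month, day) <= cutoff

# Example: on 2020/04/30, "older than 100 days" means on or before 2020/01/21.
cutoff = date(2020, 4, 30) - timedelta(days=100)
print(cutoff)  # 2020-01-21
print(is_older_than("logtype/2020/01/15/03/logfile.csv", cutoff))  # True
print(is_older_than("logtype/2020/04/29/16/logfile.csv", cutoff))  # False
```

In a full script you would list keys with a boto3 `list_objects_v2` paginator, then `copy_object` each matching key into the destination bucket and `delete_object` the original, since S3 has no native "move" operation.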

Recommended answer


As mentioned in my comments, you can create a lifecycle policy for an S3 bucket. Here are the steps to do it: https://docs.aws.amazon.com/AmazonS3/latest/user-guide/create-lifecycle.html


Deleting/expiring an object with lifecycle policy rules is optional; you define the actions you want applied to the objects in your S3 bucket.


Lifecycle policies use different storage classes to transition your objects. Before configuring lifecycle policies, I suggest reading up on the different storage classes, as each has its own associated cost: Standard-IA, One Zone-IA, Glacier, and Deep Archive.


For your 100-day use case, I recommend transitioning your logs to an archive storage class such as S3 Glacier. This might prove to be more cost-effective.
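The lifecycle rule from the linked guide can also be applied programmatically. Here is a minimal sketch of a configuration that transitions objects to Glacier after 100 days; the rule ID, bucket name, and `logtype/` prefix are illustrative assumptions:

```python
# Lifecycle configuration in the shape boto3 expects.
# Rule ID, bucket name, and the "logtype/" prefix are assumptions.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-logs-after-100-days",
            "Filter": {"Prefix": "logtype/"},
            "Status": "Enabled",
            "Transitions": [
                # Transition objects to the Glacier storage class
                # 100 days after creation.
                {"Days": 100, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

# Applying it would look like this (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-logs-bucket",
#     LifecycleConfiguration=lifecycle_config,
# )
```

Note that a transition keeps the object in the same bucket under a cheaper storage class; if the data genuinely must land in a different bucket, you would still need the copy-and-delete approach from the question.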

