List S3 buckets with their sizes in CSV format


Question

I am trying to list the S3 buckets with their sizes in CSV.

  • Bucket Name, Size
  • Bucket A, 2 GB
  • Bucket B, 10 GB

Looking for something like this...

I can list the buckets with the code below.

import csv
import json

import boto3


def main():
    with open('size.csv', 'w') as csvfile:
        writer = csv.writer(csvfile)
        writer.writerow([
            'Account Name',
            'Bucket Name',
            'Bucket Size',
        ])
        with open('accountroles.json') as ec2_file:
            ec2_data = json.load(ec2_file)
        region_list = ['us-west-1']
        for region in region_list:
            for item in ec2_data['Items']:
                Account_Number = item['Aws_Account_Number']
                Account_Name = item['Acc_Name']
                ARN = item['ARN']
                # Helper defined elsewhere: assumes the role and returns temporary keys
                b = get_assume_arn_to_keys(Account_Number, Account_Name, ARN)
                ds_client = boto3.client('s3', region_name=region,
                                         aws_access_key_id=b[1],
                                         aws_secret_access_key=b[2],
                                         aws_session_token=b[3])

                bucket_list = ds_client.list_buckets()

                for bucket in bucket_list['Buckets']:
                    # ??? how do I get the size of each bucket here ???
                    writer.writerow([
                        Account_Name,
                        bucket['Name'],
                        Bucketsize,  # not yet defined -- this is the part I am missing
                    ])


main()

I can list the buckets. Please help me work out how to get the sizes. From what I have read, it seems the size can be obtained from CloudWatch metrics. Is there a way to do this?

Please help me with the script.
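Regarding the CloudWatch route mentioned above: S3 publishes a daily `BucketSizeBytes` metric per bucket and storage class, which can be read without listing any objects. A sketch under assumptions (the function name and the idea of passing the client in are mine; `BucketSizeBytes` and the `BucketName`/`StorageType` dimensions are the names S3 actually publishes, and `StandardStorage` only covers the Standard storage class):

```python
import datetime


def bucket_size_from_cloudwatch(cw_client, bucket_name):
    """Return the most recent daily BucketSizeBytes datapoint for a bucket, or 0."""
    now = datetime.datetime.utcnow()
    response = cw_client.get_metric_statistics(
        Namespace='AWS/S3',
        MetricName='BucketSizeBytes',
        Dimensions=[
            {'Name': 'BucketName', 'Value': bucket_name},
            # Other storage classes (e.g. StandardIAStorage) are separate datapoints
            {'Name': 'StorageType', 'Value': 'StandardStorage'},
        ],
        StartTime=now - datetime.timedelta(days=2),
        EndTime=now,
        Period=86400,  # the metric is reported once per day
        Statistics=['Average'],
    )
    datapoints = response['Datapoints']
    if not datapoints:
        return 0  # empty bucket, or metric not yet published
    # Datapoints are unordered; take the newest one
    return max(datapoints, key=lambda d: d['Timestamp'])['Average']
```

It would be called with a CloudWatch client built from the same assumed-role keys as `ds_client`, e.g. `bucket_size_from_cloudwatch(boto3.client('cloudwatch', ...), bucket['Name'])`. Note the metric lags up to a day behind the bucket's actual contents.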

Edit/Update:

                bucket_list = ds_client.list_buckets()

                for bucket in bucket_list['Buckets']:

                    try:
                        lifecycle = ds_client.get_bucket_lifecycle(Bucket=bucket['Name'])
                        rules = lifecycle['Rules']
                    except:
                        rules = 'No Policy'
                    try:
                        encryption = ds_client.get_bucket_encryption(Bucket=bucket['Name'])
                        Encryptiontype = encryption['ServerSideEncryptionConfiguration']['Rules']
                    except:
                        Encryptiontype = 'Not Encrypted'

                    print(bucket['Name'], rules, Encryptiontype)
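One caveat on the update above: the bare `except:` clauses will also swallow credential, permission, and throttling errors. S3 signals "no lifecycle configured" with a specific `ClientError` code, so the handler can be narrowed. A sketch (the function name and passing the client as a parameter are mine; `NoSuchLifecycleConfiguration` is the error code S3 actually returns, and the encryption call can be treated the same way with its own code):

```python
def get_lifecycle_rules(s3_client, bucket_name):
    """Return the bucket's lifecycle rules, or 'No Policy' when none are configured."""
    try:
        return s3_client.get_bucket_lifecycle(Bucket=bucket_name)['Rules']
    except s3_client.exceptions.ClientError as err:
        # Only the "no lifecycle configured" case maps to 'No Policy';
        # anything else (access denied, throttling) is re-raised.
        if err.response['Error']['Code'] == 'NoSuchLifecycleConfiguration':
            return 'No Policy'
        raise
```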

Thanks

Answer

Here's some code that will calculate the size of a bucket. I've done it as a function so you can incorporate it into your code:

import boto3

def bucket_size(bucket):
    """Return the total size of all objects in the bucket, in MB (rounded down)."""
    size = 0

    s3_client = boto3.client('s3')

    paginator = s3_client.get_paginator('list_objects_v2')
    page_iterator = paginator.paginate(Bucket=bucket)

    for page in page_iterator:
        # An empty bucket returns pages with no 'Contents' key
        for obj in page.get('Contents', []):
            size += obj['Size']

    # Object sizes are in bytes; convert to MB (rounded down)
    return size // (1024 * 1024)

# Call function
size = bucket_size('my-bucket')
print(size)

I used a page iterator just in case you have more than 1000 objects in a bucket.
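The desired output in the question shows sizes like "2 GB" rather than a raw MB count, so the byte total can be formatted before writing the CSV row. A small helper (the function name is mine, and it uses binary units, i.e. 1 GB = 1024³ bytes):

```python
def human_size(num_bytes):
    """Format a byte count with binary units, e.g. 2147483648 -> '2.0 GB'."""
    for unit in ['B', 'KB', 'MB', 'GB', 'TB']:
        if num_bytes < 1024 or unit == 'TB':
            return f'{num_bytes:.1f} {unit}'
        num_bytes /= 1024
```

In the questioner's loop the row would then become `writer.writerow([Account_Name, bucket['Name'], human_size(size_in_bytes)])`, where `size_in_bytes` is the total before any division.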

(In addition to the license granted under the terms of service of this site, the contents of this post are licensed under MIT-0.)
