AWS Lambda put data to cross account s3 bucket

Problem description

Here is what I am trying to do:

I have access logs in account A, which are encrypted by default by AWS, and I have a Lambda function and an S3 bucket in account B. I want to trigger the Lambda when a new object lands in the account A S3 bucket, and have the Lambda in account B download the data and write it to the account B S3 bucket. Below are the blockers I am facing.

First approach: I was able to get the trigger from a new object in the account A S3 bucket to the Lambda in account B; however, the Lambda in account B is not able to download the object - Access Denied error. After looking into it for a couple of days, I figured it is because the access logs are encrypted by default, and there is no way I can add the Lambda role to the encryption key policy so that it can encrypt/decrypt the log files. So I moved on to the second approach.
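
A minimal diagnostic sketch for this kind of issue, assuming it runs with credentials that already have full access to the source bucket (for example from account A): inspect the object metadata to see which encryption is actually in use. The bucket and key names below are hypothetical placeholders.

import boto3

# Hypothetical placeholders - substitute the real source bucket and log key.
access_log_bucket = 'account-a-access-log-bucket'
access_log_key = 'logs/2019-11-22-00-00-00-EXAMPLE'

s3 = boto3.client('s3')

# Read only the object's metadata, not its body.
meta = s3.head_object(Bucket=access_log_bucket, Key=access_log_key)

# 'AES256' means SSE-S3 (no KMS key policy involved); 'aws:kms' means a reader
# also needs kms:Decrypt on the key reported in 'SSEKMSKeyId'.
print(meta.get('ServerSideEncryption'))
print(meta.get('SSEKMSKeyId'))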

Second approach: I have moved my Lambda to account A. Now the source S3 bucket and the Lambda are in account A, and the destination S3 bucket is in account B. I can now process the access logs in account A via the Lambda in account A, but when it writes the file to the account B S3 bucket, I get an Access Denied error while downloading/reading the file from account B.
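
A likely explanation for this symptom is object ownership: an object written cross-account without an ACL stays owned by the writing account (account A here), so account B cannot read it even from its own bucket. Below is a small diagnostic sketch, assuming it runs with the account A Lambda role's credentials; the bucket and key names are hypothetical placeholders.

import boto3

# Hypothetical placeholders - the destination bucket in account B and a key
# that the account A Lambda has just written.
processed_bucket = 'Account-B-bucket'
processed_key = 'processed/2019-11-22-example.log'

s3 = boto3.client('s3')

# Run with account A credentials (the object writer). If the owner is account A
# and account B does not appear in the grants, account B principals will get
# Access Denied when they try to read the object.
acl = s3.get_object_acl(Bucket=processed_bucket, Key=processed_key)
print(acl['Owner'])
for grant in acl['Grants']:
    print(grant['Grantee'], grant['Permission'])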

Lambda role policy (in addition to full S3 access and full Lambda access):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1574387531641",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
        },
        {
            "Sid": "Stmt1574387531642",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::Account-B-bucket",
                "arn:aws:s3:::Account-B-bucket/*"
            ]
        }
    ]
}

Trust relationship:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "lambda.amazonaws.com",
                "AWS": "arn:aws:iam::Account-B-ID:root"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}

Destination - Account B s3 bucket policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::Account-A-ID:role/service-role/lambda-role"
                ]
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::Account-B-Bucket",
                "arn:aws:s3:::Account-B-Bucket/*"
            ]
        },
        {
            "Sid": "Stmt11111111111111",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::Account-A-ID:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::Account-B-Bucket",
                "arn:aws:s3:::Account-B-Bucket/*"
            ]
        }
    ]
}

I am stuck here. I want the Lambda to be able to decrypt the access logs, read/process the data, and write it to the S3 bucket in the other account. Am I missing something? Help is much appreciated!

Added file metadata: [screenshot of the file properties]

Lambda code:

import io
import boto3

s3 = boto3.client('s3')

# Reading the access logs from account A. The Lambda is also running in account A.
response = s3.get_object(Bucket=access_log_bucket, Key=access_log_key)
body = response['Body']
content = io.BytesIO(body.read())

# Processing the access logs
processed_content = process_it(content)

# Writing to the account B s3 bucket
s3.put_object(Body=processed_content,
              Bucket=processed_bucket,
              Key=processed_key)

Answer

Thanks to John Rotenstein for the direction. I found the solution: I only needed to add ACL='bucket-owner-full-control' to the put_object call. Below is the complete boto3 call.

s3.put_object(
    ACL='bucket-owner-full-control',
    Body=processed_content,
    Bucket=processed_bucket,
    Key=processed_key)
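
As a quick follow-up check, assuming the Lambda in account A has written a new object with this ACL, reading it with account B credentials should now succeed, since bucket-owner-full-control grants the bucket-owning account full control of the object. The bucket and key names below are hypothetical placeholders.

import boto3

# Hypothetical placeholders - the account B destination bucket and a key
# written by the account A Lambda after the ACL change.
processed_bucket = 'Account-B-bucket'
processed_key = 'processed/2019-11-22-example.log'

# Assumed to run with account B credentials.
s3 = boto3.client('s3')
obj = s3.get_object(Bucket=processed_bucket, Key=processed_key)
print(obj['Body'].read()[:200])  # first bytes of the processed object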
