Running Spark EC2 scripts with IAM role
Question
I am trying to run Spark EC2 scripts to launch a cluster under an IAM role which my user under my root account can assume.
According to this JIRA ticket, we can now specify --profile
when running Spark EC2 scripts, and the comments on the pull request say that the --profile
option refers to what I believe is the AWS CLI profile.
When I run
ec2/spark-ec2 -k key-name -i key-name.pem -s 1 --profile myprofile --instance-type=t2.medium launch test-cluster
I get
Profile "myprofile" not found!
However, running
aws s3 ls s3://mybucket --profile myprofile
works as intended, leading me to think the IAM role was specified correctly in ~/.aws/config (I don't think you specify IAM roles in ~/.aws/credentials).
However, when I add a test profile to ~/.aws/credentials as
[foobar]
aws_secret_access_key=xxxxxxx
aws_access_key_id=xxxxxxx
Spark finds the foobar
profile. However, after adding
[foobar]
role_arn = arn:aws:iam::12345:role/MY_ROLE
aws_secret_access_key=xxxxxxx
aws_access_key_id=xxxxxxx
Spark finds the foobar
profile, but it does not correctly log into the IAM role. I get
boto.exception.EC2ResponseError: EC2ResponseError: 400 Bad Request
<?xml version="1.0" encoding="UTF-8"?>
<Response><Errors><Error><Code>InvalidKeyPair.NotFound</Code><Message>The key pair 'key-name' does not exist</Message></Error></Errors><RequestID>fcebd475-a895-4a5b-9a29-9783fd6b7f3d</RequestID></Response>
This is because the key pair key-name
does not exist under my user, but it does exist under the IAM role I need to assume. This tells me Spark is not properly logging into the IAM role.
My ~/.aws/config:
[default]
region = us-east-1
aws_secret_access_key = xxxxx
aws_access_key_id = xxxxx
[profile myprofile]
role_arn = arn:aws:iam::12345:role/MY_ROLE
source_profile = default
My ~/.aws/credentials:
[default]
aws_secret_access_key = xxxxx
aws_access_key_id = xxxxx
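One likely explanation for the profile behavior above: spark-ec2 is built on the legacy boto 2 library, and as far as I can tell boto 2 (unlike the AWS CLI) does not perform automatic role assumption from `role_arn`/`source_profile` profiles — it only recognizes profiles that carry static keys. A self-contained sketch of that distinction (the sample file path, its contents, and the `has_static_keys` helper are all hypothetical, standing in for your real ~/.aws files):

```shell
# Sample config standing in for ~/.aws/credentials + ~/.aws/config.
cat > /tmp/sample_credentials <<'EOF'
[default]
aws_secret_access_key = xxxxx
aws_access_key_id = xxxxx

[myprofile]
role_arn = arn:aws:iam::12345:role/MY_ROLE
source_profile = default
EOF

# True if the named profile section contains a static access key id,
# which is roughly what boto 2 needs to consider a profile usable.
has_static_keys() {
  sed -n "/^\[$1\]/,/^\[/p" /tmp/sample_credentials \
    | grep -q '^aws_access_key_id'
}

has_static_keys default   && echo "default: has static keys (boto 2 can use it)"
has_static_keys myprofile || echo "myprofile: role-only (AWS CLI can assume it, boto 2 cannot)"
```

This matches the symptom: `aws s3 ls --profile myprofile` works because the AWS CLI assumes the role itself, while spark-ec2 either fails to find the profile or finds it without usable credentials.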
Side note - also tried:
Assuming the role via
aws sts assume-role --role-arn arn:aws:iam::12345:role/MY_ROLE --role-session-name temp-session
then exporting the AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN, and AWS_ACCESS_KEY_ID environment variables. I then ran the EC2 scripts without any profile specified and got
boto.exception.EC2ResponseError: EC2ResponseError: 401 Unauthorized
<?xml version="1.0" encoding="UTF-8"?>
<Response><Errors><Error><Code>AuthFailure</Code><Message>AWS was not able to validate the provided access credentials</Message></Error></Errors><RequestID>11402f6e-074c-478c-84c1-11fb92ad0bff</RequestID></Response>
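One thing worth checking here: boto 2 historically read the session token from the AWS_SECURITY_TOKEN environment variable rather than AWS_SESSION_TOKEN, which would produce exactly this AuthFailure even when the STS credentials themselves are valid. A sketch of the export step covering both names (the credential values below are placeholders; the commented `aws sts` call is what you would actually run):

```shell
# In real use the three values come from STS, e.g.:
#   read -r AK SK ST <<< "$(aws sts assume-role \
#     --role-arn arn:aws:iam::12345:role/MY_ROLE \
#     --role-session-name temp-session \
#     --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
#     --output text)"
# Placeholder values stand in here so the export step is self-contained.
AK=ASIAEXAMPLE
SK=examplesecretkey
ST=examplesessiontoken

export AWS_ACCESS_KEY_ID="$AK"
export AWS_SECRET_ACCESS_KEY="$SK"
export AWS_SESSION_TOKEN="$ST"
# boto 2 looks for the token under AWS_SECURITY_TOKEN, so export both names:
export AWS_SECURITY_TOKEN="$ST"
```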
Side note - also tried:
According to this JIRA on Spark scripts with IAM roles, we can specify --instance-profile-name
(is an instance profile the only way of using an IAM role this way? i.e., would I have to ask our admin for IAM list/create permissions to launch a cluster with an IAM role?). I have tried using arn:aws:iam::12345:role/MY_ROLE
and MY_ROLE
but get
boto.exception.EC2ResponseError: EC2ResponseError: 400 Bad Request
<?xml version="1.0" encoding="UTF-8"?>
<Response><Errors><Error><Code>InvalidParameterValue</Code><Message>Value (arn:aws:iam::12345:role/MY_ROLE) for parameter iamInstanceProfile.name is invalid. Invalid IAM Instance Profile name</Message></Error></Errors><RequestID>ffeffef9-acad-4a34-a925-31f6b5bbbb3e</RequestID></Response>
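The InvalidParameterValue message (`iamInstanceProfile.name is invalid`) suggests the flag expects a bare instance-profile *name*, never an ARN, and that the bare name only works if an instance profile with that exact name actually exists (an instance profile is a separate container object that wraps a role). A small sketch of the name-vs-ARN distinction; the `arn_to_name` helper is hypothetical, and the commented `aws iam` commands assume an admin creates the profile:

```shell
# Hypothetical helper: strip an IAM ARN down to its trailing name,
# since iamInstanceProfile.name wants e.g. MY_ROLE, not the full ARN.
arn_to_name() { printf '%s\n' "${1##*/}"; }

arn_to_name "arn:aws:iam::12345:role/MY_ROLE"

# Even the bare name fails unless an instance profile with that exact
# name exists. If none does, an admin would create and attach one:
#   aws iam create-instance-profile --instance-profile-name MY_ROLE
#   aws iam add-role-to-instance-profile \
#       --instance-profile-name MY_ROLE --role-name MY_ROLE
# To check which instance profiles already wrap the role:
#   aws iam list-instance-profiles-for-role --role-name MY_ROLE
```

Note that the instance profile name frequently matches the role name (the console creates them in pairs), but that is a convention, not a guarantee, which is why MY_ROLE alone can still fail.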
Answer
I managed to assign a role to an EC2 instance by providing the '--instance-profile-name' parameter to the spark-ec2 script; you pass it an instance profile name.
Inside the instance, make sure to run
sudo yum update
Also see my question: Running Spark EC2 scripts with IAM role
Good luck