arguments error while calling an AWS Glue Pythonshell job from boto3
Question
Based on the previous post, I have an AWS Glue Pythonshell job that needs to retrieve some information from the arguments that are passed to it through a boto3 call.
My Glue job name is test_metrics.
The Glue Pythonshell code looks like this:
import sys
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv,
                          ['test_metrics',
                           's3_target_path_key',
                           's3_target_path_value'])

print("Target path key is: ", args['s3_target_path_key'])
print("Target Path value is: ", args['s3_target_path_value'])
The boto3 code that calls this job is below:
import boto3

glue = boto3.client('glue')
response = glue.start_job_run(
    JobName='test_metrics',
    Arguments={
        '--s3_target_path_key': 's3://my_target',
        '--s3_target_path_value': 's3://my_target_value'
    }
)
print(response)
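Note that start_job_run returns immediately: the 200 response only means the run was queued, not that it succeeded, which is why the failure shows up later in the Glue error log. A minimal sketch of polling the run until it reaches a terminal state (the wait_for_job_run helper is my own illustration, not part of boto3; it only uses the real glue.get_job_run API):

```python
import time

def wait_for_job_run(glue, job_name, run_id, poll_seconds=10):
    """Poll a Glue job run until it reaches a terminal state and return it.

    `glue` is a boto3 Glue client, e.g. boto3.client('glue').
    """
    terminal = {'SUCCEEDED', 'FAILED', 'STOPPED', 'TIMEOUT'}
    while True:
        run = glue.get_job_run(JobName=job_name, RunId=run_id)
        state = run['JobRun']['JobRunState']
        if state in terminal:
            return state
        time.sleep(poll_seconds)

# Usage (assumes the start_job_run call above):
# state = wait_for_job_run(glue, 'test_metrics', response['JobRunId'])
```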
I see a 200 response after I run the boto3 code on my local machine, but the Glue error log tells me:
test_metrics.py: error: the following arguments are required: --test_metrics
What am I missing?
Answer
Which job are you trying to launch, a Spark job or a Python shell job?
For a Spark job, JOB_NAME is a mandatory parameter, but a Python shell job does not need it at all.
So in your Python shell job, replace
args = getResolvedOptions(sys.argv,
                          ['test_metrics',
                           's3_target_path_key',
                           's3_target_path_value'])
with
args = getResolvedOptions(sys.argv,
                          ['s3_target_path_key',
                           's3_target_path_value'])
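To see why the extra entry caused the error: getResolvedOptions treats every name in the list as a required --name command-line argument. The sketch below simulates that behavior locally with argparse (resolve_options is a hypothetical stand-in, not the real awsglue helper, which also handles Glue's reserved arguments):

```python
import argparse

def resolve_options(argv, options):
    # Hypothetical stand-in for awsglue.utils.getResolvedOptions: each
    # name in `options` becomes a required --name argument. For
    # illustration only, not the real implementation.
    parser = argparse.ArgumentParser()
    for opt in options:
        parser.add_argument('--' + opt, required=True)
    return vars(parser.parse_args(argv[1:]))  # argv[0] is the script name

# The argv a Python shell job receives from the boto3 call above:
argv = ['test_metrics.py',
        '--s3_target_path_key', 's3://my_target',
        '--s3_target_path_value', 's3://my_target_value']

# Including 'test_metrics' in the options list would make the parser
# demand a required --test_metrics argument: exactly the error above.
args = resolve_options(argv, ['s3_target_path_key', 's3_target_path_value'])
print(args['s3_target_path_key'])
```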