Creating a boto3 s3 client on Airflow with an s3 connection and s3 hook
Problem description
I am trying to move my Python code to Airflow. I have the following code snippet:
s3_client = boto3.client('s3',
region_name="us-west-2",
aws_access_key_id=aws_access_key_id,
aws_secret_access_key=aws_secret_access_key)
I am trying to recreate this s3_client using Airflow's s3 hook and s3 connection, but I can't find a way to do it in any documentation without specifying the aws_access_key_id and the aws_secret_access_key directly in code.
Any help would be appreciated.
You need to define an AWS connection in Admin -> Connections or with the CLI (see the docs).
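As a sketch of the CLI route (assuming the Airflow 2.x CLI; the connection name and key values below are placeholders you would substitute):

```shell
# Register an AWS connection named "my_aws_conn" (placeholder name).
# Login/password map to the access key id and secret key.
airflow connections add my_aws_conn \
    --conn-type aws \
    --conn-login "<aws_access_key>" \
    --conn-password "<aws_secret_key>" \
    --conn-extra '{"region_name": "us-west-2"}'
```

This is environment configuration, so run it wherever your Airflow metadata database is reachable (e.g. on the scheduler host or inside the webserver container).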
Once the connection is defined, you can use it in S3Hook.
Your connection object can be set as:
Conn Id: <your_choice_of_conn_id_name>
Conn Type: Amazon Web Services
Login: <aws_access_key>
Password: <aws_secret_key>
Extra: {"region_name": "us-west-2"}
In Airflow, hooks wrap a Python package. Thus, if your code uses a hook, there shouldn't be a reason to import boto3 directly.