Accessing configuration parameters passed to Airflow through CLI
Question
I am trying to pass the following configuration parameters to the Airflow CLI while triggering a DAG run. The following is the trigger_dag command I am using.
airflow trigger_dag -c '{"account_list":"[1,2,3,4,5]", "start_date":"2016-04-25"}' insights_assembly_9900
My problem is: how can I access the conf parameters passed on the command line from inside an operator during the DAG run?
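As an aside on the command above: the `--conf` payload is parsed as JSON, and since `account_list` is passed as a quoted string rather than a JSON array, it arrives as the string `"[1,2,3,4,5]"`. A minimal sketch of how that payload deserializes:

```python
import json

# The payload passed with -c / --conf is parsed as JSON.
conf = json.loads('{"account_list":"[1,2,3,4,5]", "start_date":"2016-04-25"}')

# account_list is quoted in the payload, so it arrives as a string,
# not a list.
assert isinstance(conf["account_list"], str)

# If an actual list is needed, it must be parsed again (or passed
# unquoted as a JSON array in the first place).
account_list = json.loads(conf["account_list"])
```

Passing `"account_list":[1,2,3,4,5]` (no inner quotes) would instead yield a list directly.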
Answer
This is probably a continuation of the answer provided by devj.
- In airflow.cfg, the following property should be set to True:
  dag_run_conf_overrides_params = True
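For reference, this setting lives in the `[core]` section of airflow.cfg (a minimal fragment; all other keys omitted):

```ini
[core]
# Allow values in dag_run.conf to override same-named params
dag_run_conf_overrides_params = True
```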
- While defining the PythonOperator, pass the argument provide_context=True. For example:
  get_row_count_operator = PythonOperator(task_id='get_row_count', python_callable=do_work, dag=dag, provide_context=True)
- Define the Python callable (note the use of **kwargs):
  def do_work(**kwargs):
      table_name = kwargs['dag_run'].conf.get('table_name')
      # Rest of the code
- Trigger the DAG from the command line:
airflow trigger_dag read_hive --conf '{"table_name":"my_table_name"}'
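The steps above can be exercised without a running Airflow instance by stubbing the dag_run object that provide_context injects into the callable's kwargs (DagRunStub is a hypothetical stand-in for illustration, not an Airflow class):

```python
# Hypothetical stand-in for the DagRun object that Airflow injects
# into the callable's kwargs when provide_context=True.
class DagRunStub:
    def __init__(self, conf):
        self.conf = conf

def do_work(**kwargs):
    # Mirrors the callable from the answer above: read a value out of
    # the --conf payload attached to this DAG run.
    table_name = kwargs['dag_run'].conf.get('table_name')
    return table_name

# Simulates: airflow trigger_dag read_hive --conf '{"table_name":"my_table_name"}'
result = do_work(dag_run=DagRunStub({"table_name": "my_table_name"}))
print(result)  # my_table_name
```

Note that conf.get returns None for keys absent from the payload, so it degrades gracefully when the DAG is triggered without --conf.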
I have found this discussion to be helpful.