how to parallelize similar BashOperator tasks but different parameters in an Airflow DAG


Question

I have parallel execution of the 2 tasks below in my DAG. In the real world these could be 15 or 20 tasks, with the input parameters coming from an array, like below.

fruits = ["apples", "bananas"]

bad_dag = DAG('bad_dag_3', default_args=default_args, schedule_interval=None)

t0 = BashOperator(
    task_id="print",
    bash_command='echo "Beginning parallel tasks next..." ',
    dag=bad_dag)

t1 = BashOperator(
    task_id="fruit_" + fruits[0],
    params={"fruits": fruits},
    bash_command='echo fruit= {{ params.fruits[0] }} ',
    dag=bad_dag)

t2 = BashOperator(
    task_id="fruit_" + fruits[1],
    params={"fruits": fruits},
    bash_command='echo fruit= {{ params.fruits[1] }} ',
    dag=bad_dag)

t0 >> [t1, t2]

What's the best way for me to write this DAG, so I don't have to re-write the same BashOperator over and over again like I have above?

I cannot use a loop, because (I believe) I cannot parallelize the tasks if I use a loop.
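That assumption is worth examining: a Python loop runs once, at DAG-definition time, and only declares task objects and edges; it does not serialize execution. The following plain-Python sketch models that idea with a hypothetical `Task` class (it is not Airflow's API, just an illustration of why loop-built tasks remain independent):

```python
# Hypothetical minimal model of DAG construction (NOT Airflow's API).
# The loop only declares tasks and dependency edges; nothing executes here,
# so a scheduler is free to run all "fruit" tasks concurrently.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []

    def set_upstream(self, other):
        self.upstream.append(other)


fruits = ["apples", "bananas"]
t0 = Task("print")

fruit_tasks = []
for fruit in fruits:
    t = Task("fruit_" + fruit)
    t.set_upstream(t0)   # each task depends only on t0, not on each other
    fruit_tasks.append(t)

task_ids = [t.task_id for t in fruit_tasks]
```

Because no fruit task depends on another fruit task, the resulting graph is a fan-out from `t0`, exactly like the hand-written `t0 >> [t1, t2]` version.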

Answer

Use the DAG below. The idea is that the task_id for each task should be unique; Airflow will handle the rest.

fruits = ["apples", "bananas"]

bad_dag = DAG('bad_dag_3', default_args=default_args, schedule_interval=None)

t0 = BashOperator(
    task_id="print",
    bash_command='echo "Beginning parallel tasks next..." ',
    dag=bad_dag)

for fruit in fruits:
    task_t = BashOperator(
        task_id="fruit_" + fruit,
        params={"fruit": fruit},
        bash_command='echo fruit= {{ params.fruit }} ',
        dag=bad_dag)

    t0 >> task_t
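If you prefer a single expression, the loop collapses to a list comprehension, and since Airflow's `>>` accepts a list, the whole fan-out becomes one statement (`t0 >> fruit_tasks`). The sketch below isolates the per-task id/params/command generation in plain Python (no Airflow import), producing exactly what the loop above would pass to each BashOperator:

```python
fruits = ["apples", "bananas"]

# One spec dict per fruit; in the real DAG each dict would become a
# BashOperator(**spec, dag=bad_dag), collected into fruit_tasks, followed
# by a single `t0 >> fruit_tasks` to declare the fan-out.
task_specs = [
    {
        "task_id": "fruit_" + fruit,
        "params": {"fruit": fruit},
        "bash_command": 'echo fruit= {{ params.fruit }} ',
    }
    for fruit in fruits
]

task_ids = [spec["task_id"] for spec in task_specs]
```

Note that the template switches from `{{ params.fruits[0] }}` in the original DAG to `{{ params.fruit }}` here: each task carries only its own value, which keeps the command identical across tasks while the rendered output differs.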
