How to clear failing DAGs using the CLI in Airflow
Problem description
I have some failing DAGs, let's say from 1st-Feb to 20th-Feb. From that date onward, all of them succeeded.
I tried to use the CLI (instead of doing it twenty times with the web UI):
airflow clear -f -t * my_dags.my_dag_id
But I got a weird error:
airflow: error: unrecognized arguments: airflow-webserver.pid airflow.cfg airflow_variables.json my_dags.my_dag_id
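The file names in that error message are the giveaway: bash expanded the unquoted `*` into the contents of the working directory before `airflow` ever ran. A minimal reproduction sketch (the scratch directory is an assumption; the file names mirror the error message):

```shell
# Set up a scratch directory containing files like the asker's working directory
mkdir -p /tmp/glob_repro && cd /tmp/glob_repro
touch airflow-webserver.pid airflow.cfg airflow_variables.json

# echo stands in for the airflow binary: bash expands the unquoted *
# first, so the command receives the file names as extra arguments
echo clear -f -t * my_dags.my_dag_id
```

The extra positional arguments are exactly what `airflow` then rejects as "unrecognized arguments".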
Edit 1:
As @tobi6 explained, the * was indeed causing trouble. Knowing that, I tried this command instead:
airflow clear -u -d -f -t ".*" my_dags.my_dag_id
But it only returns failed task instances (the -f flag). The -d and -u flags don't seem to work, because task instances downstream and upstream of the failed ones are ignored (not returned).
Edit 2:
As @tobi6 suggested, using -s and -e permits selecting all DAG runs within a date range. Here is the command:
airflow clear -s "2018-04-01 00:00:00" -e "2018-04-01 00:00:00" my_dags.my_dag_id
However, adding the -f flag to the command above only returns failed task instances. Is it possible to select all failed task instances of all failed DAG runs within a date range?
Recommended answer
If you use an asterisk * in Linux bash, the shell automatically expands it to the contents of the current working directory. That is, it replaces the asterisk with all the file names in that directory and only then executes your command.
Quoting the asterisk will help to avoid the automatic expansion, so bash passes it through literally:
airflow clear -f -t "*" my_dags.my_dag_id