Error calling BashOperator: Bash command failed


Problem description

Here are my DAG file and BashOperator task:

my_dag = DAG(
    dag_id='my_dag',
    start_date=datetime(year=2017, month=3, day=28),
    schedule_interval='0 1 * * *',
)

my_bash_task = BashOperator(
    task_id="my_bash_task",
    bash_command=bash_command,
    dag=my_dag)

bash_command = "/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh ""

Following this answer I even added a space after the bash file path to avoid a TemplateNotFound error. But running this task still gives me: airflow.exceptions.AirflowException: Bash command failed.
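
For reference, here is a minimal sketch of that workaround, using the path and DAG names from the question (the explanation of the mechanism is general Airflow behaviour, not something stated in the post): BashOperator renders bash_command through Jinja, and a value ending in ".sh" is treated as the name of a template file to load, which is where TemplateNotFound comes from; the trailing space keeps the value a plain command string.

from airflow.operators.bash_operator import BashOperator

my_bash_task = BashOperator(
    task_id="my_bash_task",
    # The trailing space stops Jinja from looking the ".sh" path up as a template file.
    bash_command="/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh ",
    dag=my_dag,
)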

The bash_command file contents are:

#!/bin/bash
DATABASE=db_name
FILE=$DATABASE-`date +%F-%H%M%S`.backup
export PGPASSWORD=password
pg_dump -h localhost -p 5432 -U developer -F c -b -v -f ~/Dropbox/database_backup/location/$FILE db_name
unset PGPASSWORD

However, instead of pointing bash_command to the bash file, writing the commands in a multi-line string works:

bash_command = """
DATABASE=db_name
FILE=$DATABASE-`date +%F-%H%M%S`.backup
export PGPASSWORD=password
pg_dump -h localhost -p 5432 -U developer -F c -b -v -f ~/Dropbox/database_backup/location/$FILE db_name
unset PGPASSWORD
"""

Because of this I am assuming that the error is not caused by the bash commands themselves. I even tried replacing #!/bin/bash in the bash file with #!/bin/sh; that did not work either.

I ran sh db_back_up_bash.sh from Terminator and it works fine.

Update with the actual code:

bash_file_location_to_backup_db = '{{"/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh"}}'
# bash_file_location_to_backup_db = "/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh "
bash_command = """
DATABASE=ksaprice_scraping
FILE=$DATABASE-`date +%F-%H%M%S`.backup
export PGPASSWORD=password
pg_dump -h localhost -p 5432 -U developer -F c -b -v -f ~/Dropbox/database_backup/ksaprice/$FILE ksaprice_scraping
unset PGPASSWORD
"""

backup_scraped_db_in_dropbox_task = BashOperator(
    task_id="backup_scraped_db_in_dropbox_task",
    # bash_command=bash_command,  # this works fine
    bash_command=bash_file_location_to_backup_db,  # this gives the error: airflow.exceptions.AirflowException: Bash command failed
    dag=dag_crawl
)

Error trace:

[2017-04-11 20:02:14,905] {bash_operator.py:90} INFO - Output:
2017-04-11 20:02:14,905 | INFO| root : Output:
[2017-04-11 20:02:14,906] {bash_operator.py:94} INFO - /tmp/airflowtmp7FffJ2/backup_scraped_db_in_dropbox_taskQ6IVxm: line 1: /home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh: Permission denied
2017-04-11 20:02:14,906 | INFO| root : /tmp/airflowtmp7FffJ2/backup_scraped_db_in_dropbox_taskQ6IVxm: line 1: /home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh: Permission denied
[2017-04-11 20:02:14,906] {bash_operator.py:97} INFO - Command exited with return code 126
2017-04-11 20:02:14,906 | INFO| root : Command exited with return code 126
[2017-04-11 20:02:14,906] {models.py:1417} ERROR - Bash command failed
Traceback (most recent call last):
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 100, in execute
    raise AirflowException("Bash command failed")
AirflowException: Bash command failed
2017-04-11 20:02:14,906 | ERROR| root : Bash command failed
Traceback (most recent call last):
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 100, in execute
    raise AirflowException("Bash command failed")
AirflowException: Bash command failed
[2017-04-11 20:02:14,907] {models.py:1441} INFO - Marking task as FAILED.
2017-04-11 20:02:14,907 | INFO| root : Marking task as FAILED.
[2017-04-11 20:02:14,947] {models.py:1462} ERROR - Bash command failed
2017-04-11 20:02:14,947 | ERROR| root : Bash command failed
Traceback (most recent call last):
  File "/home/jak/my_projects/workflow_env/bin/airflow", line 28, in <module>
    args.func(args)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/bin/cli.py", line 585, in test
    ti.run(ignore_task_deps=True, ignore_ti_state=True, test_mode=True)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/utils/db.py", line 53, in wrapper
    result = func(*args, **kwargs)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 100, in execute
    raise AirflowException("Bash command failed")
airflow.exceptions.AirflowException: Bash command faile
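
As a side observation on the trace itself (the accepted answer below takes a different route): the task fails with "Permission denied" and return code 126, which is the code a shell reports when a file exists but is not executable. A minimal check of that hypothesis, assuming the script path from the question:

import os
import stat

script = "/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh"

# False here would match the "Permission denied" / exit code 126 in the log above.
print(os.access(script, os.X_OK))

# Granting the owner execute bit (the equivalent of `chmod +x`) would clear that specific error.
# os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)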

Recommended answer

I consider this a bug in Airflow; Jinja should not expect .sh files to contain template information in a BashOperator.

I got around it by putting the command into a format Jinja will interpret correctly:

bash_command = '{{"/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh"}}'
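For completeness, a sketch of that value plugged into the task from the question's update (names taken from the post). Because the value no longer ends in ".sh", Jinja treats it as an inline template that simply renders to the literal path, rather than as a template file to load.

backup_scraped_db_in_dropbox_task = BashOperator(
    task_id="backup_scraped_db_in_dropbox_task",
    # Renders to the literal script path at run time; no template-file lookup is attempted.
    bash_command='{{"/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh"}}',
    dag=dag_crawl,
)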
