How to retrieve a value from Airflow XCom pushed via SSHExecuteOperator


Problem description

I have the following DAG with two SSHExecuteOperator tasks. The first task executes a stored procedure which returns a parameter. The second task needs this parameter as an input.

Could you please explain how to pull the value from the XCom pushed in task1, in order to use it in task2?

from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator
from airflow.models import Variable

default_args = {
  'owner': 'airflow',
  'depends_on_past': False,
  'start_date': datetime.now(),
  'email': ['my@email.com'],
  'email_on_failure': True,
  'retries': 0
}

#server must be changed to point to the correct environment, to do so update DataQualitySSHHook variable in Airflow admin
DataQualitySSHHook = Variable.get('DataQualitySSHHook')
print('Connecting to: ' + DataQualitySSHHook)
sshHookEtl = SSHHook(conn_id=DataQualitySSHHook)
sshHookEtl.no_host_key_check = True 

#create dag
dag = DAG(
  'ed_data_quality_test-v0.0.3', #update version whenever you change something
  default_args=default_args,
  schedule_interval="0 0 * * *",
  dagrun_timeout=timedelta(hours=24),
  max_active_runs=1)

#create tasks
task1 = SSHExecuteOperator(
  task_id='run_remote_sp_audit_batch_register',
  bash_command="bash /opt/scripts/data_quality/EXEC_SP_AUDIT_BATCH.sh 'ED_DATA_QUALITY_MANUAL' 'REGISTER' '1900-01-01 00:00:00.000000' '2999-12-31 00:00:00.000000' ", #keep the space at the end
  ssh_hook=sshHookEtl,
  xcom_push=True,
  retries=0,
  dag=dag)

task2 = SSHExecuteOperator(
  task_id='run_remote_sp_audit_module_session_start',
  bash_command="echo {{ ti.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}",
  ssh_hook=sshHookEtl,
  retries=0,
  dag=dag)

#create dependencies
task1.set_downstream(task2)


Answer

The solution I found: when task1 executes the shell script, you have to make sure the parameter you want captured into the XCom variable is the last thing printed by your script (using echo).
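As a stand-alone illustration of that idea (plain Python driving bash, with a hypothetical batch id; this is not the actual EXEC_SP_AUDIT_BATCH.sh), the script can log freely as long as the value comes last:

```python
import subprocess

# Sketch of a script whose final echo carries the parameter;
# everything printed before it is ordinary logging.
script = r"""
echo "connecting..."
echo "stored procedure executed"
echo "12345"
"""

# Run it and keep the last non-empty line of stdout, which is the part
# the answer relies on ending up in the XCom variable.
output = subprocess.run(["bash", "-c", script],
                        capture_output=True, text=True).stdout
batch_id = [line for line in output.splitlines() if line.strip()][-1]
print(batch_id)
```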

Then I was able to retrieve the XCom variable value with the following code snippet:

{{ task_instance.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}
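That template can then be embedded in task2's bash_command, for example to pass the pulled value as a script argument. A sketch (EXEC_SP_AUDIT_MODULE.sh is a hypothetical downstream script name, not from the question):

```python
# Airflow renders the Jinja template at runtime, substituting whatever
# task1 echoed last; here we only build the command string.
pull = "{{ task_instance.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}"
bash_command = "bash /opt/scripts/data_quality/EXEC_SP_AUDIT_MODULE.sh '" + pull + "'"
print(bash_command)
```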

