AWS Batch Job Execution Results in Step Function


Problem Description


I'm a newbie to AWS Step Functions and AWS Batch. I'm trying to integrate an AWS Batch job with a Step Function. The AWS Batch job executes a simple Python script that outputs a string value (a high-level, simplified requirement). I need the Python script's output to be available to the next state of the step function. How should I accomplish this? The AWS Batch job output does not contain the result of the Python script; instead, it contains all the container-related information along with the input values.

Example: the AWS Batch job executes a Python script that outputs "Hello World". I need "Hello World" available to the next state of the step function, to execute a lambda associated with it.
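For context, the Batch job's script can be as small as the sketch below (the function name is hypothetical); the important part is that whatever the container prints goes to the job's CloudWatch log stream, which is how the answer below recovers the output.

```python
def produce_output():
    # The string value we want the next Step Functions state to see.
    return 'Hello World'


if __name__ == '__main__':
    # AWS Batch captures the container's stdout into a CloudWatch log
    # stream (log group /aws/batch/job by default for managed jobs).
    print(produce_output())
```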

Solution

I was able to do it. Below is my state machine. I took the sample project for running a batch job, Manage a Batch Job (AWS Batch, Amazon SNS), and modified it with two lambdas for passing input/output.

{
  "Comment": "An example of the Amazon States Language for notification on an AWS Batch job completion",
  "StartAt": "Submit Batch Job",
  "TimeoutSeconds": 3600,
  "States": {
    "Submit Batch Job": {
      "Type": "Task",
      "Resource": "arn:aws:states:::batch:submitJob.sync",
      "Parameters": {
        "JobName": "BatchJobNotification",
        "JobQueue": "arn:aws:batch:us-east-1:1234567890:job-queue/BatchJobQueue-737ed10e7ca3bfd",
        "JobDefinition": "arn:aws:batch:us-east-1:1234567890:job-definition/BatchJobDefinition-89c42b1f452ac67:1"
      },
      "Next": "Notify Success",
      "Catch": [
        {
          "ErrorEquals": [
            "States.ALL"
          ],
          "Next": "Notify Failure"
        }
      ]
    },
    "Notify Success": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:1234567890:function:readcloudwatchlogs",
      "Parameters": {
        "LogStreamName.$": "$.Container.LogStreamName"
      },
      "ResultPath": "$.lambdaOutput",
      "Next": "ConsumeLogs"
    },
    "ConsumeLogs": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:1234567890:function:consumelogs",
      "Parameters": {
        "randomstring.$": "$.lambdaOutput.logs"
      },
      "End": true
    },
    "Notify Failure": {
      "Type": "Task",
      "Resource": "arn:aws:states:::sns:publish",
      "Parameters": {
        "Message": "Batch job submitted through Step Functions failed",
        "TopicArn": "arn:aws:sns:us-east-1:1234567890:StepFunctionsSample-BatchJobManagement17968f39-e227-47ab-9a75-08a7dcc10c4c-SNSTopic-1GR29R8TUHQY8"
      },
      "End": true
    }
  }
}

The key to reading the logs was in the Submit Batch Job output, which contains the LogStreamName. I passed that to my lambda named readcloudwatchlogs, which reads the logs, and then passed the retrieved logs on to the next function, named consumelogs. You can see the consumelogs function printing the logs in the attached screenshot.
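The effect of `"ResultPath": "$.lambdaOutput"` in the Notify Success state can be sketched in plain Python (the values here are hypothetical): Step Functions keeps the state input and grafts the task result onto it at the given path, so both survive into the ConsumeLogs state.

```python
# State input to "Notify Success" (a fragment of the Batch job output).
state_input = {'Container': {'LogStreamName': 'example-stream'}}

# What the readcloudwatchlogs lambda returns.
task_result = {'logs': 'Hello world'}

# With "ResultPath": "$.lambdaOutput", the result is merged into the input
# rather than replacing it.
merged = dict(state_input)
merged['lambdaOutput'] = task_result

# "ConsumeLogs" then selects "randomstring.$": "$.lambdaOutput.logs".
print(merged['lambdaOutput']['logs'])
```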


{
  "Attempts": [
    {
      "Container": {
        "ContainerInstanceArn": "arn:aws:ecs:us-east-1:1234567890:container-instance/BatchComputeEnvironment-4a1593ce223b3cf_Batch_7557555f-5606-31a9-86b9-83321eb3e413/6d11fdbfc9eb4f40b0d6b85c396bb243",
        "ExitCode": 0,
        "LogStreamName": "BatchJobDefinition-89c42b1f452ac67/default/2ad955bf59a8418893f53182f0d87b4b",
        "NetworkInterfaces": [],
        "TaskArn": "arn:aws:ecs:us-east-1:1234567890:task/BatchComputeEnvironment-4a1593ce223b3cf_Batch_7557555f-5606-31a9-86b9-83321eb3e413/2ad955bf59a8418893f53182f0d87b4b"
      },
      "StartedAt": 1611329367577,
      "StatusReason": "Essential container in task exited",
      "StoppedAt": 1611329367748
    }
  ],
  "Container": {
    "Command": [
      "echo",
      "Hello world"
    ],
    "ContainerInstanceArn": "arn:aws:ecs:us-east-1:1234567890:container-instance/BatchComputeEnvironment-4a1593ce223b3cf_Batch_7557555f-5606-31a9-86b9-83321eb3e413/6d11fdbfc9eb4f40b0d6b85c396bb243",
    "Environment": [
      {
        "Name": "MANAGED_BY_AWS",
        "Value": "STARTED_BY_STEP_FUNCTIONS"
      }
    ],
    "ExitCode": 0,
    "Image": "137112412989.dkr.ecr.us-east-1.amazonaws.com/amazonlinux:latest",
    "LogStreamName": "BatchJobDefinition-89c42b1f452ac67/default/2ad955bf59a8418893f53182f0d87b4b",
    "TaskArn": "arn:aws:ecs:us-east-1:1234567890:task/BatchComputeEnvironment-4a1593ce223b3cf_Batch_7557555f-5606-31a9-86b9-83321eb3e413/2ad955bf59a8418893f53182f0d87b4b",
..
  },
..
  "Tags": {
    "resourceArn": "arn:aws:batch:us-east-1:1234567890:job/d36ba07a-54f9-4acf-a4b8-3e5413ea5ffc"
  }
}

  • Read Logs Lambda code:

import boto3

client = boto3.client('logs')

def lambda_handler(event, context):
    print(event)
    # AWS Batch writes container stdout to the /aws/batch/job log group;
    # the stream name comes from the Submit Batch Job state's output.
    response = client.get_log_events(
        logGroupName='/aws/batch/job',
        logStreamName=event.get('LogStreamName')
    )
    # Return the first log line ("Hello world") under the 'logs' key,
    # which ResultPath places at $.lambdaOutput.
    log = {'logs': response['events'][0]['message']}
    return log
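The handler above returns only the first log event. If the script prints several lines, a variant that pages through `get_log_events` and collects every message could look like this sketch; the `read_all_messages` helper and its structure are my additions, not part of the original answer:

```python
def read_all_messages(client, log_stream_name, log_group_name='/aws/batch/job'):
    """Collect every message in a stream by paging get_log_events.

    CloudWatch Logs signals the end of a stream by returning the same
    nextForwardToken twice in a row.
    """
    messages, token = [], None
    while True:
        kwargs = {
            'logGroupName': log_group_name,
            'logStreamName': log_stream_name,
            'startFromHead': True,
        }
        if token:
            kwargs['nextToken'] = token
        response = client.get_log_events(**kwargs)
        messages.extend(e['message'] for e in response['events'])
        if response['nextForwardToken'] == token:
            break
        token = response['nextForwardToken']
    return messages


def lambda_handler(event, context):
    import boto3  # imported here so read_all_messages stays testable offline
    client = boto3.client('logs')
    return {'logs': '\n'.join(read_all_messages(client, event['LogStreamName']))}
```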

  • Consume Logs Lambda code:

import json

print('Loading function')


def lambda_handler(event, context):
    # Receives {'randomstring': '<log text>'} from the ConsumeLogs state's
    # Parameters block; here it is simply printed.
    print(event)

