BatchOperatorAsync

Astronomer Providers

Execute a job asynchronously on AWS Batch

Access Instructions

Install the Astronomer Providers package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.

Parameters

job_name: the name of the job to run on AWS Batch (templated)
job_definition: the job definition name on AWS Batch
job_queue: the queue name on AWS Batch
overrides: the containerOverrides parameter for boto3 (templated)
array_properties: the arrayProperties parameter for boto3
parameters: the parameters for boto3 (templated)
job_id: the job ID, usually unknown (None) until the submit_job operation returns the jobId defined by AWS Batch
waiters: a BatchWaiters object (see note below); if None, polling is used with max_retries and status_retries
max_retries: number of exponential back-off retries, default 4200 = 48 hours; polling is only used when waiters is None
status_retries: number of HTTP retries to get job status, default 10; polling is only used when waiters is None
aws_conn_id: connection ID of AWS credentials / region name. If None, the default boto3 credential strategy is used.
region_name: region name to use in the AWS Hook. Overrides the region_name in the connection (if provided)
tags: collection of tags to apply to the AWS Batch job submission; if None, no tags are submitted

Documentation

Execute a job asynchronously on AWS Batch

Note

Any custom waiters must return a waiter for these calls:

waiter = waiters.get_waiter("JobExists")
waiter = waiters.get_waiter("JobRunning")
waiter = waiters.get_waiter("JobComplete")
