BatchOperator
Execute a job on AWS Batch (Amazon provider)
Access Instructions
Install the Amazon provider package into your Airflow environment.
Import the operator into your DAG file and instantiate it with your desired parameters.
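For example (a minimal sketch; the package name and import path below are the standard ones for the Amazon provider):

    # Install into the Airflow environment:
    #   pip install apache-airflow-providers-amazon
    from airflow.providers.amazon.aws.operators.batch import BatchOperator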
Parameters
job_name (Required): the name for the job that will run on AWS Batch (templated)
job_definition (Required): the job definition name on AWS Batch
job_queue (Required): the queue name on AWS Batch
overrides (Required): the containerOverrides parameter for boto3 (templated)
array_properties: the arrayProperties parameter for boto3
parameters: the parameters for boto3 (templated)
job_id: the job ID, usually unknown (None) until the submit_job operation returns the jobId defined by AWS Batch
waiters: a BatchWaiters object (see note below); if None, polling is used with max_retries and status_retries
max_retries: number of exponential back-off retries (default: 4200, about 48 hours); polling is used only when waiters is None
status_retries: number of HTTP retries to get job status (default: 10); polling is used only when waiters is None
aws_conn_id: connection ID for AWS credentials and region name; if None, the default boto3 credential strategy is used
region_name: the AWS region name to use in the hook; overrides the region_name in the connection, if provided
tags: collection of tags to apply to the AWS Batch job submission; if None, no tags are submitted
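As a usage sketch, the parameters above map onto an instantiation like the following; the DAG id, job name, job definition, and job queue are placeholder values and must exist in your own AWS account:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.batch import BatchOperator

    with DAG(
        dag_id="example_aws_batch",  # placeholder DAG id
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        submit_batch_job = BatchOperator(
            task_id="submit_batch_job",
            job_name="example-job",                   # templated
            job_definition="example-job-definition",  # must exist in AWS Batch
            job_queue="example-job-queue",            # must exist in AWS Batch
            overrides={"command": ["echo", "hello world"]},  # boto3 containerOverrides
            aws_conn_id="aws_default",
        )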
Documentation
Execute a job on AWS Batch
See also
For more information on how to use this operator, take a look at the guide: Submit a new AWS Batch job
Note
Any custom waiters must return a waiter for these calls:

    waiter = waiters.get_waiter("JobExists")
    waiter = waiters.get_waiter("JobRunning")
    waiter = waiters.get_waiter("JobComplete")
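A custom waiters object can be built from the provider's BatchWaiters hook and passed through the operator's waiters parameter; a minimal sketch, assuming the default waiter configuration shipped with the provider:

    from airflow.providers.amazon.aws.hooks.batch_waiters import BatchWaiters

    # Uses the provider's default waiter model; a custom model can be
    # supplied via the waiter_config argument instead.
    waiters = BatchWaiters(aws_conn_id="aws_default")

    # The operator requests these three waiters during execution:
    waiter = waiters.get_waiter("JobExists")
    waiter = waiters.get_waiter("JobRunning")
    waiter = waiters.get_waiter("JobComplete")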