EcsRunTaskOperator

Provider: Amazon

Execute a task on AWS ECS (Elastic Container Service)

Last Updated: Apr. 12, 2023

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters.
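
For example, assuming a pip-based environment, the provider can be installed with pip install apache-airflow-providers-amazon, and the operator then imported in the DAG file:

    from airflow.providers.amazon.aws.operators.ecs import EcsRunTaskOperator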

Parameters

task_definition (Required): The task definition name on Elastic Container Service.
cluster (Required): The cluster name on Elastic Container Service.
overrides (Required): The same parameter that boto3's run_task receives (templated); see https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.run_task and the example after this list.
aws_conn_id: Connection ID of AWS credentials / region name. If None, the boto3 credential strategy is used (https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html).
region: Region name to use in the AWS Hook. Overrides the region defined in the connection (if provided).
launch_type: The launch type on which to run your task ('EC2', 'EXTERNAL', or 'FARGATE').
capacity_provider_strategy: The capacity provider strategy to use for the task. When capacity_provider_strategy is specified, the launch_type parameter is omitted. If neither capacity_provider_strategy nor launch_type is specified, the cluster's default capacity provider strategy is used.
group: The name of the task group associated with the task.
placement_constraints: An array of placement constraint objects to use for the task.
placement_strategy: An array of placement strategy objects to use for the task.
platform_version: The platform version on which your task is running.
network_configuration: The network configuration for the task.
tags: A dictionary of tags in the form {'tagKey': 'tagValue'}.
awslogs_group: The CloudWatch group where your ECS container logs are stored. Only required if you want logs to be shown in the Airflow UI after your job has finished.
awslogs_region: The region in which your CloudWatch logs are stored. If None, this is the same as the region parameter. If that is also None, this is the default AWS region based on your connection settings.
awslogs_stream_prefix: The stream prefix used for the CloudWatch logs. This is usually based on some custom name combined with the name of the container. Only required if you want logs to be shown in the Airflow UI after your job has finished.
awslogs_fetch_interval: The interval the ECS task log fetcher waits between successive CloudWatch log fetches.
quota_retry: Configuration for whether and how to retry launching a new ECS task, to handle transient errors.
reattach: If True, checks whether a task previously launched by this task_instance is already running; if so, the operator attaches to it instead of starting a new task. This avoids relaunching a task when the connection between Airflow and ECS drops while the task is running (for example, when the Airflow worker is restarted).
number_logs_exception: Number of lines from the end of the CloudWatch logs to include in the AirflowException raised when an ECS task is stopped (so that Airflow alerts carry the logs of what failed in the code running on ECS).
wait_for_completion: If True, waits for the ECS task to complete before the operator finishes. (default: True)
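
As a minimal sketch of how these parameters fit together, the DAG below runs a Fargate task and forwards its CloudWatch logs to the Airflow UI. The cluster, task definition, container, subnet, and log group names are placeholders, and the overrides payload follows the boto3 run_task shape linked above:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.ecs import EcsRunTaskOperator

    with DAG(
        dag_id="ecs_run_task_example",
        start_date=datetime(2023, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        run_task = EcsRunTaskOperator(
            task_id="run_hello_world",
            # Placeholder resource names: replace with your own.
            cluster="my-ecs-cluster",
            task_definition="hello-world-task",
            launch_type="FARGATE",
            # Same shape boto3's run_task accepts; here it overrides the
            # container command.
            overrides={
                "containerOverrides": [
                    {
                        "name": "hello-world-container",
                        "command": ["echo", "hello world"],
                    }
                ]
            },
            # Fargate tasks use awsvpc networking, so subnets are required.
            network_configuration={
                "awsvpcConfiguration": {
                    "subnets": ["subnet-12345678"],
                    "assignPublicIp": "ENABLED",
                }
            },
            # Only needed if you want the container logs shown in the
            # Airflow UI after the task finishes.
            awslogs_group="/ecs/hello-world",
            awslogs_stream_prefix="ecs/hello-world-container",
        )

With launch_type='FARGATE', the network_configuration block is required because Fargate tasks run with awsvpc networking; for EC2 launch types using bridge networking it can be omitted.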

Documentation

Execute a task on AWS ECS (Elastic Container Service)

See also

For more information on how to use this operator, take a look at the guide: Run a Task Definition
