BatchCreateComputeEnvironmentOperator

Provider: Amazon

Create an AWS Batch compute environment

Last Updated: Apr. 12, 2023

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.
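For example, after installing the provider with pip install apache-airflow-providers-amazon, the operator can be imported from the provider's batch module:

```python
from airflow.providers.amazon.aws.operators.batch import (
    BatchCreateComputeEnvironmentOperator,
)
```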

Parameters

compute_environment_name: the name of the AWS Batch compute environment (templated)
environment_type: the type of the compute environment (MANAGED or UNMANAGED)
state: the state of the compute environment (ENABLED or DISABLED)
compute_resources: details about the resources managed by the compute environment (templated). For the full structure of this argument, see https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/batch.html#Batch.Client.create_compute_environment
unmanaged_v_cpus: the maximum number of vCPUs for an unmanaged compute environment. This parameter is only supported when the type parameter is set to UNMANAGED.
service_role: the IAM role that allows Batch to make calls to other AWS services on your behalf (templated)
tags: the tags that you apply to the compute environment to help you categorize and organize your resources
max_retries: number of exponential back-off retries (4200 corresponds to roughly 48 hours); polling is only used when waiters is None
status_retries: number of HTTP retries to get the job status (default: 10); polling is only used when waiters is None
aws_conn_id: connection ID for AWS credentials / region name. If None, the default boto3 credential strategy will be used.
region_name: region name to use in the AWS Hook. Overrides the region_name in the connection (if provided)
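As a minimal sketch of how these parameters fit together, the operator might be instantiated in a DAG as follows. All resource values here (the environment name, subnet, security group, and service role ARN) are illustrative placeholders, not real AWS resources:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.batch import (
    BatchCreateComputeEnvironmentOperator,
)

with DAG(
    dag_id="example_create_batch_compute_env",
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    create_compute_env = BatchCreateComputeEnvironmentOperator(
        task_id="create_compute_env",
        compute_environment_name="my-fargate-env",  # placeholder name
        environment_type="MANAGED",
        state="ENABLED",
        compute_resources={
            # Placeholder Fargate resources; see the boto3 documentation
            # linked above for the full structure of this dictionary.
            "type": "FARGATE",
            "maxvCpus": 16,
            "subnets": ["subnet-0123456789abcdef0"],
            "securityGroupIds": ["sg-0123456789abcdef0"],
        },
        # Placeholder service role ARN.
        service_role="arn:aws:iam::123456789012:role/AWSBatchServiceRole",
        aws_conn_id="aws_default",
        region_name="us-east-1",
    )
```

Because compute_environment_name, compute_resources, and service_role are templated, Jinja expressions can be used for those fields.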

Documentation

Create an AWS Batch compute environment

See also

For more information on how to use this operator, take a look at the guide: Create an AWS Batch compute environment
