BashOperator


Execute a Bash script, command or set of commands.


Last Updated: Jan. 14, 2023

Access Instructions

The BashOperator ships with core Apache Airflow, so no separate provider package needs to be installed.

Import the module into your DAG file and instantiate it with your desired params.
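
In Airflow 2.x, for example, the operator is imported from core Airflow:

from airflow.operators.bash import BashOperator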

Parameters

bash_command (required): The command, set of commands, or reference to a Bash script (must end in .sh) to execute. (templated)
env: If not None, a dict that defines the environment variables for the new process; these are used instead of inheriting the current process environment, which is the default behavior. (templated)
append_env: If False (default), only the variables passed in env are used and the current process environment is not inherited. If True, the current process environment is inherited, and variables passed in env either update existing inherited variables or are appended to them.
output_encoding: Output encoding of the Bash command.
skip_exit_code: If the task exits with this exit code, leave the task in the skipped state (default: 99). If set to None, any non-zero exit code is treated as a failure.
cwd: Working directory in which to execute the command. If None (default), the command is run in a temporary directory.
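
A minimal sketch showing how these parameters fit together; the DAG id, schedule, and command below are illustrative, not prescribed:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_bash_usage",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
):
    greet = BashOperator(
        task_id="greet",
        bash_command='echo "Hello, $NAME"',  # templated string
        env={"NAME": "Airflow"},  # used instead of the inherited environment by default
        append_env=False,  # set True to merge env into the current process environment
        output_encoding="utf-8",  # default encoding of the command's output
        skip_exit_code=99,  # exit code 99 leaves the task in the skipped state
        cwd=None,  # None (default) runs the command in a temporary directory
    )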

Documentation

Execute a Bash script, command or set of commands.

See also

For more information on how to use this operator, take a look at the guide: BashOperator

If BaseOperator.do_xcom_push is True, the last line written to stdout will also be pushed to an XCom when the Bash command completes.
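
A short sketch of this behavior inside a DAG definition, with hypothetical task ids; the downstream task pulls the last stdout line pushed by the upstream one:

get_date = BashOperator(
    task_id="get_date",
    bash_command="date +%Y-%m-%d",  # the last line of stdout becomes the return_value XCom
    do_xcom_push=True,  # this is the default
)

show_date = BashOperator(
    task_id="show_date",
    # pull the XCom pushed by get_date via the ti template variable
    bash_command='echo "Upstream date: {{ ti.xcom_pull(task_ids=\'get_date\') }}"',
)

get_date >> show_date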

Airflow will evaluate the exit code of the Bash command. In general, a non-zero exit code results in task failure and zero results in task success. Exit code 99 (or another value set in skip_exit_code) will raise an airflow.exceptions.AirflowSkipException, which leaves the task in the skipped state. You can have all non-zero exit codes treated as failures by setting skip_exit_code=None. A short example follows the table below.

Exit code                      Behavior
0                              success
skip_exit_code (default: 99)   raise airflow.exceptions.AirflowSkipException
otherwise                      raise airflow.exceptions.AirflowException
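
For instance, a command can deliberately skip its task by exiting with the skip code; the file path here is illustrative:

check_trigger = BashOperator(
    task_id="check_trigger",
    # exit 99 when the file is absent, which leaves the task in the skipped state
    bash_command="test -f /tmp/trigger_file || exit 99",
    skip_exit_code=99,  # 99 is already the default; shown for clarity
)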

Note

Airflow will not recognize a non-zero exit code unless the whole shell exits with a non-zero exit code. This can be an issue if the non-zero exit arises from a sub-command. The easiest way to address this is to prefix the command with set -e;

Example:

bash_command = "set -e; python3 script.py '{{ next_execution_date }}'"

Note

Add a space after the script name when directly calling a .sh script with the bash_command argument, for example bash_command="my_script.sh ". This is because Airflow tries to load the file and render it as a Jinja template when the value ends with .sh, which is likely not what most users want.
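
A brief sketch of the workaround with a hypothetical script name; note the trailing space:

run_script = BashOperator(
    task_id="run_script",
    # the trailing space prevents Airflow from loading the value as a Jinja template file
    bash_command="my_script.sh ",
)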

Warning

Care should be taken with “user” input or when using Jinja templates in the bash_command, as this bash operator does not perform any escaping or sanitization of the command.

This applies mostly to using the "dag_run" conf, as that can be submitted by users in the Web UI. Most of the default template variables are not at risk.

For example, do not do this:

bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "Here is the message: \'{{ dag_run.conf["message"] if dag_run else "" }}\'"',
)

Instead, you should pass this via the env kwarg and use double-quotes inside the bash_command, as below:

bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo \"here is the message: '$message'\"",
    env={"message": '{{ dag_run.conf["message"] if dag_run else "" }}'},
)

Example DAGs

Advanced dbt Tutorial: An advanced example DAG from the Astronomer tutorial featuring the execution of dbt commands in Airflow. (Apache Airflow, Data Processing)

Basic dbt Tutorial: A basic example DAG from the Astronomer tutorial featuring the execution of dbt commands in Airflow. (Apache Airflow, Data Processing)

bash_operator: Example DAG demonstrating the usage of the BashOperator. (Apache Airflow, Airflow Fundamentals)

complex: Example Airflow DAG that shows a complex DAG structure. (Apache Airflow, Airflow Fundamentals)

task_group: Example DAG demonstrating the usage of the TaskGroup. (Apache Airflow, Airflow Fundamentals)

trigger_target_dag: Example usage of the TriggerDagRunOperator. This example holds two DAGs: the first DAG (example_trigger_controller_dag) holds a TriggerDagRunOperator, which triggers the second DAG (example_tri…). (Apache Airflow, Airflow Fundamentals)

passing_params_via_test_command: Example DAG demonstrating the usage of the params arguments in templated arguments. (Apache Airflow, Airflow Fundamentals)

xcomargs: Example DAG demonstrating the usage of XComArgs. (Apache Airflow, Airflow Fundamentals)

tutorial: Documentation that goes along with the Airflow tutorial located at https://airflow.apache.org/tutorial.html. (Apache Airflow, Airflow Fundamentals)

Dynamic dbt Data Pipeline: An example of a dbt pipeline which generates tasks dynamically from a `manifest.json` file. (Apache Airflow, Data Processing, ETL/ELT)

sample_xlsx_to_parquet: Sample DAG. (XLSX)

Pipeline Alerts and Notifications using Microsoft Teams: Example DAG demonstrating how to implement Microsoft Teams alerting and notifications. (Apache Airflow, HTTP, Alerts/Notifications)

Pipeline Alerts and Notifications for Multiple Microsoft Teams Channels: Example DAG demonstrating how to implement alerting and notifications for multiple Microsoft Teams channels. (Apache Airflow, HTTP, Alerts/Notifications)

Pipeline Alerts and Notifications using Slack: Example DAG demonstrating how to implement alerting and notifications in Slack. (Apache Airflow, Slack, Alerts/Notifications)

Pipeline Alerts and Notifications for Multiple Slack Channels: Example DAG demonstrating how to implement alerting and notifications for multiple Slack channels. (Apache Airflow, Slack, HTTP, Alerts/Notifications)

Rerun dbt Models from Failure: This example shows how you can use the `dbt build` command to rerun models from the point of failure. (Apache Airflow, Hightouch, ETL/ELT)

Datasets Downstream 1: This is an example of a DAG that consumes two datasets. (Apache Airflow, Airflow Fundamentals, ETL/ELT)

Datasets Upstream 1: This is an example of a DAG with dataset producer tasks. (Apache Airflow, Airflow Fundamentals, ETL/ELT)

Datasets Upstream 2: This is an example of a DAG with dataset producer tasks. (Apache Airflow, Airflow Fundamentals, ETL/ELT)
