BranchPythonOperator

Apache Airflow Certified

Allows a workflow to “branch”, following one of several downstream paths based on the result of this task.


Last Updated: Apr. 8, 2023

Access Instructions

Install Apache Airflow into your environment; BranchPythonOperator ships with the core distribution (airflow.operators.python), so no separate provider package is required.

Import the operator into your DAG file and instantiate it with your desired parameters, as in the sketch below.
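For illustration, a minimal sketch of importing and instantiating the operator in a DAG file. The DAG id, dates, and callable name are placeholders, and the `schedule` argument assumes Airflow 2.4+ (earlier versions use `schedule_interval`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import BranchPythonOperator

def choose_branch(**context):
    # Return the task_id of the downstream task to follow.
    # "path_a" is a placeholder and must exist directly downstream in the DAG.
    return "path_a"

with DAG(dag_id="branch_example", start_date=datetime(2023, 1, 1), schedule=None):
    branch = BranchPythonOperator(
        task_id="branch",
        python_callable=choose_branch,
    )
```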

Parameters

python_callable: A reference to a callable object (the branching function).
op_kwargs: A dictionary of keyword arguments that will get unpacked in your function.
op_args: A list of positional arguments that will get unpacked when calling your callable.
templates_dict: A dictionary whose values are templates that the Airflow engine renders at some point between __init__ and execute; the rendered values are made available in your callable’s context. (templated)
templates_exts: A list of file extensions to resolve while processing templated fields, for example ['.sql', '.hql'].
show_return_value_in_logs: Whether to log the callable’s return value. Defaults to True, which allows return value log output. Set it to False to suppress log output when the return value is large, such as when pushing a large amount of data to XCom.
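As a hedged sketch of how these parameters reach the callable; the function name, argument names, and task ids are illustrative, not from this page:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator

def pick_path(metric_name, threshold=0, templates_dict=None, **context):
    # op_args fills `metric_name`, op_kwargs fills `threshold`, and the
    # rendered templates_dict is injected from the task context.
    run_date = templates_dict["run_date"]  # "{{ ds }}" already rendered here
    print(f"Evaluating {metric_name} for {run_date} against {threshold}")
    return "path_a" if threshold > 5 else "path_b"

with DAG(dag_id="branch_params", start_date=datetime(2023, 1, 1), schedule=None):
    branch = BranchPythonOperator(
        task_id="branch",
        python_callable=pick_path,
        op_args=["error_rate"],                   # unpacked as positional arguments
        op_kwargs={"threshold": 10},              # unpacked as keyword arguments
        templates_dict={"run_date": "{{ ds }}"},  # rendered by the Jinja engine
        show_return_value_in_logs=True,           # default; set False for large returns
    )
    # The returned task_ids must exist directly downstream of the branch task.
    branch >> [EmptyOperator(task_id="path_a"), EmptyOperator(task_id="path_b")]
```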

Documentation

Allows a workflow to “branch”, following one of several downstream paths after the execution of this task.

It derives from PythonOperator and expects a Python function that returns a single task_id, or a list of task_ids, to follow. The returned task_id(s) must point to a task directly downstream of the branching task. All other directly downstream tasks (“branches”) are marked with a state of skipped so that those paths cannot move forward. The skipped states are propagated downstream to allow the DAG state to fill up and the DAG run’s state to be inferred.
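To make the skip behavior concrete, here is a minimal runnable sketch of a branching DAG. The task ids, the coin-flip logic, and the trigger rule on the join task are illustrative assumptions; EmptyOperator requires Airflow 2.3+:

```python
import random
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator

def choose(**context):
    # Must return the task_id (or list of task_ids) of a task
    # directly downstream of the branch task.
    return random.choice(["path_a", "path_b"])

with DAG(dag_id="branch_demo", start_date=datetime(2023, 1, 1), schedule=None):
    branch = BranchPythonOperator(task_id="branch", python_callable=choose)
    path_a = EmptyOperator(task_id="path_a")
    path_b = EmptyOperator(task_id="path_b")
    # The unchosen branch is marked "skipped", and that state propagates
    # to its downstream tasks unless a trigger rule overrides it.
    join = EmptyOperator(task_id="join", trigger_rule="none_failed_min_one_success")

    branch >> [path_a, path_b] >> join
```

Because skipped states propagate, a task that joins the branches needs a trigger rule such as none_failed_min_one_success; with the default all_success rule, the join task would itself be skipped whenever any branch upstream of it was skipped.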
