BigQueryTablePartitionExistenceTrigger

Google

Initialize the BigQuery Table Partition Existence Trigger with the needed parameters. The trigger polls BigQuery at the configured interval until the requested partition exists in the given table; the individual parameters are described under Parameters below.


Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired parameters, as in the sketch below.
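
A minimal sketch of instantiating the trigger, assuming the Google provider's BigQuery triggers module path; the project, dataset, table, and partition values are hypothetical placeholders:

from airflow.providers.google.cloud.triggers.bigquery import (
    BigQueryTablePartitionExistenceTrigger,
)

trigger = BigQueryTablePartitionExistenceTrigger(
    partition_id="20230227",             # hypothetical partition id, e.g. for a daily-partitioned table
    project_id="my-gcp-project",         # hypothetical Google Cloud project
    dataset_id="my_dataset",             # hypothetical dataset
    table_id="my_table",                 # hypothetical table
    gcp_conn_id="google_cloud_default",  # Google Cloud connection id to use
    hook_params={},                      # extra keyword arguments forwarded to the hook
    poll_interval=10.0,                  # seconds between existence checks
)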

Parameters

partition_id: The name of the partition to check the existence of.
project_id: The Google Cloud project where the job is running.
dataset_id: The dataset ID of the requested table.
table_id: The table ID of the requested table.
gcp_conn_id: Reference to the Google Cloud connection ID.
hook_params: Extra parameters passed to the underlying hook.
poll_interval: Polling period, in seconds, between existence checks.

Documentation

This trigger runs in the Airflow triggerer process and backs deferrable BigQuery sensors: it connects to BigQuery through the configured gcp_conn_id, checks every poll_interval seconds whether partition_id exists in the table identified by project_id, dataset_id, and table_id, and fires a trigger event once the partition is found, at which point the deferred task resumes on a worker.
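
The usual pattern is to hand the trigger to self.defer() from inside a deferrable operator or sensor. A minimal sketch, assuming a hypothetical WaitForPartitionOperator and placeholder BigQuery identifiers:

from __future__ import annotations

from typing import Any

from airflow.models import BaseOperator
from airflow.providers.google.cloud.triggers.bigquery import (
    BigQueryTablePartitionExistenceTrigger,
)


class WaitForPartitionOperator(BaseOperator):
    """Hypothetical operator that defers until a BigQuery partition exists."""

    def execute(self, context):
        # Hand control to the triggerer; the worker slot is released
        # until the trigger fires an event.
        self.defer(
            trigger=BigQueryTablePartitionExistenceTrigger(
                partition_id="20230227",       # hypothetical partition id
                project_id="my-gcp-project",   # hypothetical project
                dataset_id="my_dataset",       # hypothetical dataset
                table_id="my_table",           # hypothetical table
                gcp_conn_id="google_cloud_default",
                hook_params={},
                poll_interval=10.0,
            ),
            method_name="execute_complete",
        )

    def execute_complete(self, context, event: dict[str, Any] | None = None):
        # Called on a worker once the trigger fires its event.
        self.log.info("Trigger event received: %s", event)
        return event

Deferring rather than blocking in a sensor keeps a long-running partition wait from occupying a worker slot for its entire duration.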
