BigQueryDataTransferServiceTransferRunSensor

Google

Waits for a Data Transfer Service transfer run to complete.


Last Updated: Sep. 13, 2022

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

expected_statuses: The expected state of the transfer run. See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferOperations#Status
run_id (required): ID of the transfer run.
transfer_config_id (required): ID of the transfer config to be used.
project_id: The BigQuery project ID where the transfer configuration should be created. If set to None or missing, the default project_id from the Google Cloud connection is used.
retry: A retry object used to retry requests. If None is specified, requests will not be retried.
request_timeout: The amount of time, in seconds, to wait for the request to complete. Note that if retry is specified, the timeout applies to each individual attempt.
metadata: Additional metadata that is provided to the method.
impersonation_chain: Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).
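The access instructions above can be sketched as a minimal DAG. The import path is the sensor's module in the Google provider package; the DAG id, project ID, and the upstream task name used in the templated XCom pulls are placeholder assumptions, not values from this page:

```python
# A minimal sketch, assuming the Google provider package is installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.sensors.bigquery_dts import (
    BigQueryDataTransferServiceTransferRunSensor,
)

with DAG(
    dag_id="example_bigquery_dts_sensor",  # hypothetical DAG id
    start_date=datetime(2022, 9, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Poke until the transfer run reaches one of the expected states.
    # "start_transfer" is a hypothetical upstream task that started the
    # run and pushed its IDs to XCom.
    wait_for_transfer_run = BigQueryDataTransferServiceTransferRunSensor(
        task_id="wait_for_transfer_run",
        transfer_config_id="{{ ti.xcom_pull('start_transfer')['config_id'] }}",
        run_id="{{ ti.xcom_pull('start_transfer')['run_id'] }}",
        project_id="my-gcp-project",  # placeholder; omit to use the connection default
        expected_statuses={"SUCCEEDED"},
    )
```

Since `expected_statuses` is a set, the sensor can also be told to succeed on several terminal states (for example `{"SUCCEEDED", "CANCELLED"}`) when a downstream task should run either way.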

Documentation

Waits for a Data Transfer Service transfer run to complete.

See also

For more information on how to use this sensor, take a look at the guide: Manually starting transfer runs
