CloudDataTransferServiceCreateJobOperator

Google

Creates a transfer job that runs periodically.

Last Updated: Feb. 25, 2023

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

body (Required): The request body, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs#TransferJob, with three additional conveniences: dates can be given as datetime.date, times can be given as datetime.time, and Amazon Web Services credentials should be stored in the connection indicated by the aws_conn_id parameter.
aws_conn_id: The connection ID used to retrieve credentials for Amazon Web Services.
gcp_conn_id: The connection ID used to connect to Google Cloud.
api_version: The API version to use (e.g. v1).
google_impersonation_chain: Optional Google service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, each identity in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).
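To illustrate the shape of the body parameter, here is a minimal sketch of an S3-to-GCS transfer job. The bucket names, project ID, and schedule values are placeholders; the field names follow the TransferJob REST resource linked above, and the schedule fields use the native datetime.date / datetime.time values the operator accepts in place of the API's structured date and time objects.

```python
from datetime import date, time

# Hypothetical transfer job body; bucket names and project ID are placeholders.
body = {
    "description": "Nightly S3 -> GCS sync",
    "projectId": "my-gcp-project",
    "status": "ENABLED",
    "schedule": {
        # The operator accepts native datetime.date / datetime.time values
        # here instead of the API's {year, month, day} / {hours, minutes} dicts.
        "scheduleStartDate": date(2023, 3, 1),
        "startTimeOfDay": time(2, 30),
    },
    "transferSpec": {
        "awsS3DataSource": {"bucketName": "my-s3-bucket"},
        "gcsDataSink": {"bucketName": "my-gcs-bucket"},
    },
}
```

This body dict is what you would pass as the operator's body argument; the AWS credentials themselves stay out of the body and come from the connection named by aws_conn_id.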

Documentation

Warning

This operator is NOT idempotent in the following cases:

  • name is not passed in the body param

  • the transfer job's name has been soft-deleted; in this case, each new task receives a unique suffix

If you run it many times, many transfer jobs will be created in Google Cloud.
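Assuming the job body is a plain dict, one way to guard against accidental duplicates is to pin an explicit name before creating the job. This is a hedged sketch, not part of the operator's API; the job id below is a placeholder, and per the TransferJob REST resource, names take the form "transferJobs/<id>".

```python
# Hypothetical body with a pinned job name; the id after "transferJobs/" is a
# placeholder. Pinning the name keeps repeated runs from creating new jobs.
body_with_name = {
    "name": "transferJobs/nightly-s3-to-gcs-sync",
    "description": "Nightly S3 -> GCS sync",
}

def has_stable_name(body: dict) -> bool:
    """Return True if the body pins a job name (creation is then idempotent)."""
    return bool(body.get("name"))
```

A check like this could run before instantiating the operator, so a missing name fails fast instead of silently multiplying jobs.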

See also

For more information on how to use this operator, take a look at the guide: CloudDataTransferServiceCreateJobOperator
