AzureBlobStorageToGCSOperator

Microsoft Azure

This operator transfers data from Azure Blob Storage to a specified bucket in Google Cloud Storage.


Last Updated: Jan. 23, 2023

Access Instructions

Install the Microsoft Azure provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.
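For example, a minimal sketch (the package name and import path below reflect recent versions of the Microsoft Azure provider; confirm them against the version installed in your environment):

# Install the provider package (shell):
#   pip install apache-airflow-providers-microsoft-azure
# Then import the operator in your DAG file:
from airflow.providers.microsoft.azure.transfers.azure_blob_to_gcs import AzureBlobStorageToGCSOperator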

Parameters

wasb_conn_id: Reference to the wasb connection.
gcp_conn_id: The connection ID to use when fetching connection info.
blob_name (required): Name of the blob.
file_path (required): Path to the file to download.
container_name (required): Name of the container.
bucket_name (required): The bucket to upload to.
object_name (required): The object name to set when uploading the file.
filename (required): The local file path to the file to be uploaded.
gzip (required): Option to compress the local file or file data for upload.
delegate_to (required): The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
impersonation_chain: Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account.
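The sketch below shows how the operator might be instantiated inside a DAG. The DAG id, connection IDs, container, bucket, blob name, and local file paths are all placeholder values chosen for illustration, not values defined by the operator itself:

from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.azure.transfers.azure_blob_to_gcs import AzureBlobStorageToGCSOperator

with DAG(
    dag_id="azure_blob_to_gcs_example",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    transfer_blob = AzureBlobStorageToGCSOperator(
        task_id="transfer_blob_to_gcs",
        wasb_conn_id="wasb_default",              # Azure Blob Storage connection
        gcp_conn_id="google_cloud_default",       # Google Cloud connection
        container_name="my-azure-container",      # hypothetical source container
        blob_name="data/report.csv",              # hypothetical blob to transfer
        file_path="/tmp/report.csv",              # local path the blob is downloaded to
        filename="/tmp/report.csv",               # local file then uploaded to GCS
        bucket_name="my-gcs-bucket",              # hypothetical destination bucket
        object_name="data/report.csv",            # destination object name in GCS
        gzip=False,                               # do not compress before upload
        delegate_to=None,                         # no domain-wide delegation
    )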

Documentation

This operator downloads a blob from an Azure Blob Storage container to a local file and uploads that file to the specified bucket in Google Cloud Storage.

See also

For more information on how to use this operator, take a look at the guide: Transfer Data from Blob Storage to Google Cloud Storage
