GCSToGCSOperator

Copies objects from one bucket to another, renaming them if requested.

Access Instructions

Install the Google provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.
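
For example, a minimal sketch of the setup (assuming a standard Airflow environment; the package name is that of the Google provider):

# Install the provider first, e.g.:
#   pip install apache-airflow-providers-google
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator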

Parameters

source_bucket (required): The source Google Cloud Storage bucket where the object is. (templated)
source_object: The name of the object to copy from the source Google Cloud Storage bucket. (templated) You can use only one wildcard for objects (filenames) within your bucket; the wildcard can appear inside the object name or at the end of it. Appending a wildcard to the bucket name is unsupported.
source_objects: A list of names of objects to copy from the source Google Cloud Storage bucket. (templated)
destination_bucket: The destination Google Cloud Storage bucket where the object should be. If destination_bucket is None, it defaults to source_bucket. (templated)
destination_object: The destination name of the object in the destination Google Cloud Storage bucket. (templated) If a wildcard is supplied in the source_object argument, this is the prefix that will be prepended to the final destination objects' paths. Note that the source path's part before the wildcard is removed; if it needs to be retained, append it to destination_object. For example, with prefix foo/* and destination_object blah/, the file foo/baz will be copied to blah/baz; to retain the prefix, write the destination_object as e.g. blah/foo, in which case the copied file will be named blah/foo/baz. The same applies to source objects inside source_objects.
move_object: When True, the object is moved instead of copied to the new location, the equivalent of a mv command as opposed to a cp command.
replace: Whether to overwrite existing destination files.
delimiter: Restricts the result to only the 'files' in a given 'folder'. If source_objects = ['foo/bah/'] and delimiter = '.avro', then only the files in the folder foo/bah/ ending in .avro will be copied to the destination.
gcp_conn_id: (Optional) The connection ID used to connect to Google Cloud.
delegate_to: The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
last_modified_time: When specified, objects are copied or moved only if they were modified after last_modified_time. If tzinfo has not been set, UTC will be assumed. See the sketch after this list.
maximum_modified_time: When specified, objects are copied or moved only if they were modified before maximum_modified_time. If tzinfo has not been set, UTC will be assumed.
is_older_than: When specified, objects are copied only if they are older than the specified time in seconds.
impersonation_chain: Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account. (templated)
source_object_required: Whether to raise an exception when the source object doesn't exist. Has no effect when the source objects are folders or patterns.
exact_match: When specified, only an exact match of the source object (filename) will be copied. See the sketch after the final example below.
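
As a sketch of the time-based filters above (bucket names, paths, and timestamps are illustrative; google_cloud_conn_id follows the examples below), only objects modified within a window are copied:

import datetime

copy_recent_files = GCSToGCSOperator(
    task_id='copy_recent_files',
    source_bucket='data',
    source_object='sales/sales-2017/*.avro',
    destination_bucket='data_backup',
    # copy only objects modified after this instant (UTC is assumed when tzinfo is unset)
    last_modified_time=datetime.datetime(2017, 1, 1),
    # ... and only objects modified before this one
    maximum_modified_time=datetime.datetime(2017, 2, 1),
    gcp_conn_id=google_cloud_conn_id,
)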

Documentation

Copies objects from one bucket to another, renaming them if requested.

See also

For more information on how to use this operator, take a look at the guide: GCSToGCSOperator

The following Operator would copy a single file named sales/sales-2017/january.avro in the data bucket to the file named copied_sales/2017/january-backup.avro in the data_backup bucket:

copy_single_file = GCSToGCSOperator(
    task_id='copy_single_file',
    source_bucket='data',
    source_objects=['sales/sales-2017/january.avro'],
    destination_bucket='data_backup',
    destination_object='copied_sales/2017/january-backup.avro',
    gcp_conn_id=google_cloud_conn_id,
)

The following Operator would copy all the Avro files from the sales/sales-2017 folder (i.e. with names starting with that prefix) in the data bucket to the copied_sales/2017 folder in the data_backup bucket:

copy_files = GCSToGCSOperator(
    task_id='copy_files',
    source_bucket='data',
    source_objects=['sales/sales-2017'],
    destination_bucket='data_backup',
    destination_object='copied_sales/2017/',
    delimiter='.avro',
    gcp_conn_id=google_cloud_conn_id,
)

Or:

copy_files = GCSToGCSOperator(
    task_id='copy_files',
    source_bucket='data',
    source_object='sales/sales-2017/*.avro',
    destination_bucket='data_backup',
    destination_object='copied_sales/2017/',
    gcp_conn_id=google_cloud_conn_id,
)

The following Operator would move all the Avro files from the sales/sales-2017 folder (i.e. with names starting with that prefix) in the data bucket to the same folder in the data_backup bucket, deleting the original files in the process:

move_files = GCSToGCSOperator(
    task_id='move_files',
    source_bucket='data',
    source_object='sales/sales-2017/*.avro',
    destination_bucket='data_backup',
    move_object=True,
    gcp_conn_id=google_cloud_conn_id,
)

The following Operator would move all the Avro files from the sales/sales-2019 and sales/sales-2020 folders in the data bucket to the same folders in the data_backup bucket, deleting the original files in the process:

move_files = GCSToGCSOperator(
    task_id='move_files',
    source_bucket='data',
    source_objects=['sales/sales-2019/*.avro', 'sales/sales-2020'],
    destination_bucket='data_backup',
    delimiter='.avro',
    move_object=True,
    gcp_conn_id=google_cloud_conn_id,
)
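
Since only an exact match is copied when exact_match is set (per the parameter description above), a source object name without it can also match other objects sharing that name as a prefix. As a sketch (file and bucket names are illustrative), the copy can be restricted to the exact filename:

copy_exact_file = GCSToGCSOperator(
    task_id='copy_exact_file',
    source_bucket='data',
    source_object='sales/sales-2017/january.avro',
    destination_bucket='data_backup',
    # copy only the object whose name matches exactly, not every object
    # sharing this name as a prefix
    exact_match=True,
    gcp_conn_id=google_cloud_conn_id,
)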
