GCSTimeSpanFileTransformOperator

Google

Determines a list of objects that were added to or modified at a GCS source location during a specific time span, copies them to a temporary location on the local file system, runs a transform on these files as specified by the transformation script, and uploads the output to the destination bucket.


Last Updated: Feb. 25, 2023

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
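For example, a minimal instantiation might look like the following sketch. The DAG ID, schedule, bucket names, connection IDs, and script path are placeholders for your own values, not values taken from this documentation:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.gcs import GCSTimeSpanFileTransformOperator

with DAG(
    dag_id="gcs_timespan_transform_example",  # placeholder DAG ID
    start_date=datetime(2023, 1, 1),
    schedule="@hourly",  # each run covers one hourly time span
    catchup=False,
) as dag:
    transform_files = GCSTimeSpanFileTransformOperator(
        task_id="transform_files",
        source_bucket="my-source-bucket",            # placeholder
        source_prefix="incoming/",                   # templated
        source_gcp_conn_id="google_cloud_default",
        destination_bucket="my-destination-bucket",  # placeholder
        destination_prefix="processed/",             # templated
        destination_gcp_conn_id="google_cloud_default",
        transform_script=["python", "/opt/airflow/scripts/transform.py"],  # placeholder path
    )
```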

Parameters

source_bucket (Required): The bucket to fetch data from. (templated)
source_prefix (Required): Prefix string that filters objects whose names begin with this prefix. Can interpolate execution date and time components. (templated)
source_gcp_conn_id (Required): The connection ID to use when connecting to Google Cloud to download the files to be processed.
source_impersonation_chain: Optional service account to impersonate using short-term credentials (to download the files to be processed), or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account. (templated)
destination_bucket (Required): The bucket to write data to. (templated)
destination_prefix (Required): Prefix string for the upload location. Can interpolate execution date and time components. (templated)
destination_gcp_conn_id (Required): The connection ID to use when connecting to Google Cloud to upload the processed files.
destination_impersonation_chain: Optional service account to impersonate using short-term credentials (to upload the processed files), or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account. (templated)
transform_script (Required): Location of the executable transformation script, or a list of arguments passed to subprocess, e.g. ['python', 'script.py', 10]. (templated)
chunk_size: The size of a chunk of data when downloading or uploading (in bytes). Must be a multiple of 256 KB (per the Google Cloud Storage API specification).
download_continue_on_fail: If set to true, the task does not error out when a download fails and continues with the remaining files.
upload_chunk_size: The size of a chunk of data when uploading (in bytes). Must be a multiple of 256 KB (per the Google Cloud Storage API specification).
upload_continue_on_fail: If set to true, the task does not error out when an upload fails and continues with the remaining files.
upload_num_attempts: Number of attempts to try to upload a single file.
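Building on the minimal sketch above, the optional parameters can also be set explicitly. The service accounts and tuning values below are illustrative only, not defaults:

```python
transform_files_tuned = GCSTimeSpanFileTransformOperator(
    task_id="transform_files_tuned",
    source_bucket="my-source-bucket",
    source_prefix="incoming/",
    source_gcp_conn_id="google_cloud_default",
    # Chained impersonation: the connection's account must be able to mint tokens
    # for the first account, which in turn mints tokens for the last one; the
    # download runs as the last account in the list. Accounts are illustrative.
    source_impersonation_chain=[
        "intermediate-sa@my-project.iam.gserviceaccount.com",
        "download-sa@my-project.iam.gserviceaccount.com",
    ],
    destination_bucket="my-destination-bucket",
    destination_prefix="processed/",
    destination_gcp_conn_id="google_cloud_default",
    transform_script=["python", "/opt/airflow/scripts/transform.py"],  # placeholder
    chunk_size=4 * 256 * 1024,         # must be a multiple of 256 KB
    download_continue_on_fail=True,    # skip objects that fail to download
    upload_continue_on_fail=True,      # skip outputs that fail to upload
    upload_num_attempts=3,             # retry each file upload up to 3 times
)
```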

Documentation

Determines a list of objects that were added to or modified at a GCS source location during a specific time span, copies them to a temporary location on the local file system, runs a transform on these files as specified by the transformation script, and uploads the output to the destination bucket.

See also

For more information on how to use this operator, take a look at the guide: GCSTimeSpanFileTransformOperator

The locations of the source and destination files in the local filesystem are provided as the first and second arguments to the transformation script. The time span is passed to the transform script as the third and fourth arguments, as UTC ISO 8601 strings.

The transformation script is expected to read the data from the source, transform it, and write the output to the local destination location.
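A minimal sketch of such a script is shown below. It assumes, as the examples shipped with the Google provider do, that the first two arguments are local directories (one holding the downloaded objects, one whose contents will be uploaded) and that the transform simply upper-cases each file's contents; adapt the per-file logic to your own needs:

```python
# transform.py -- illustrative only.
# Invoked by the operator as:
#   <transform_script> <source_path> <destination_path> <timespan_start> <timespan_end>
import sys
from pathlib import Path

source = Path(sys.argv[1])         # local location of the downloaded objects
destination = Path(sys.argv[2])    # local location whose contents will be uploaded
timespan_start = sys.argv[3]       # UTC ISO 8601 string
timespan_end = sys.argv[4]         # UTC ISO 8601 string

print(f"Transforming objects modified between {timespan_start} and {timespan_end}")

# Assumption: source and destination are directories, as in the provider's examples.
for src_file in source.glob("**/*"):
    if not src_file.is_file():
        continue
    out_file = destination / src_file.relative_to(source)
    out_file.parent.mkdir(parents=True, exist_ok=True)
    out_file.write_text(src_file.read_text().upper())
```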
