CloudDataFusionCreatePipelineOperator

Google

Creates a Cloud Data Fusion pipeline.

Last Updated: Feb. 25, 2023

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

pipeline_name (Required): Your pipeline name.
pipeline (Required): The pipeline definition. For more information, see: https://docs.cdap.io/cdap/current/en/developer-manual/pipelines/developing-pipelines.html#pipeline-configuration-file-format
instance_name (Required): The name of the instance.
location (Required): The Cloud Data Fusion location in which to handle the request.
namespace: If your pipeline belongs to a Basic edition instance, the namespace ID is always "default". If your pipeline belongs to an Enterprise edition instance, you can create a namespace.
api_version: The version of the API to request, for example "v3".
gcp_conn_id: The connection ID to use when fetching connection info.
delegate_to: The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
impersonation_chain: Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, each identity in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).
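The `pipeline` parameter expects a dictionary following the CDAP pipeline-configuration file format linked above. A hedged, illustrative skeleton (the artifact version and stage plugins here are assumptions, not values from this page):

```python
# Illustrative skeleton of a pipeline definition. Field names follow the
# CDAP pipeline-configuration format; the stage plugins and version are
# placeholders and would need real plugin properties to deploy.
pipeline = {
    "name": "example-pipeline",
    "artifact": {
        "name": "cdap-data-pipeline",  # batch pipeline artifact
        "version": "6.8.0",            # assumed CDAP version
        "scope": "SYSTEM",
    },
    "config": {
        # Stages declare the plugins in the pipeline...
        "stages": [
            {"name": "source", "plugin": {"name": "GCSFile", "type": "batchsource"}},
            {"name": "sink", "plugin": {"name": "BigQueryTable", "type": "batchsink"}},
        ],
        # ...and connections wire their outputs to inputs.
        "connections": [{"from": "source", "to": "sink"}],
    },
}
```

This dictionary is passed as-is to the Data Fusion REST API, so any field valid in a pipeline-configuration file can appear here.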

Documentation

Creates a Cloud Data Fusion pipeline.

See also

For more information on how to use this operator, take a look at the guide: Create a DataFusion pipeline
