DataflowStopJobOperator
Stops the Dataflow job with the specified name prefix or job ID. All jobs whose names start with the provided prefix will be stopped. Streaming jobs are drained by default.
Access Instructions
Install the Google provider package into your Airflow environment.
Import the module into your DAG file and instantiate it with your desired params.
Parameters
job_name_prefix: Name prefix of the jobs to be stopped.
job_id: Job ID of the job to be stopped.
project_id: Optional, the Google Cloud project ID in which the job runs. If set to None or missing, the default project_id from the Google Cloud connection is used.
location: Optional, the job location. If set to None or missing, "us-central1" is used.
gcp_conn_id: The connection ID to use when connecting to Google Cloud.
delegate_to: The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
poll_sleep: The time in seconds to sleep between polls of Google Cloud for the Dataflow job status, to confirm the job is stopped.
impersonation_chain: Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, each identity in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).
drain_pipeline: Optional, set to False to stop a streaming job by canceling it instead of draining it. See: https://cloud.google.com/dataflow/docs/guides/stopping-a-pipeline
stop_timeout: Wait time in seconds for successful job canceling/draining.
Documentation
Parameters job_name_prefix and job_id are mutually exclusive.
See also
For more details on stopping a pipeline, see: https://cloud.google.com/dataflow/docs/guides/stopping-a-pipeline
See also
For more information on how to use this operator, take a look at the guide: Stopping a pipeline