Apache Flink

Creates a flinkDeployment object in a Kubernetes cluster.


Last Updated: Dec. 21, 2022

Access Instructions

Install the Apache Flink provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.


application_file (Required): The Kubernetes custom resource definition of flinkDeployment, given as a path to a .yaml or .json file, a YAML string, or a JSON string.
namespace: The Kubernetes namespace in which to create the flinkDeployment.
kubernetes_conn_id: The connection ID for the target Kubernetes cluster.
api_group: The Kubernetes API group of flinkDeployment.
api_version: The Kubernetes API version of flinkDeployment.
in_cluster: Run the Kubernetes client with in-cluster configuration.
cluster_context: The context that points to the Kubernetes cluster. Ignored when in_cluster is True. If None, the current-context is used.
config_file: The path to the Kubernetes config file (templated). If not specified, defaults to ~/.kube/config.



See also

For more information on how to use this operator, take a look at the guide: FlinkKubernetesOperator

See also

For more detail about the FlinkDeployment object, see the reference: https://nightlies.apache.org/flink/flink-kubernetes-operator-docs-main/docs/custom-resource/reference/#flinkdeployment
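The manifest passed as application_file follows that FlinkDeployment schema. A minimal sketch, modeled on the upstream operator's basic example; the image tag, resource sizes, and jar path are illustrative assumptions you should adapt to your job:

```yaml
# Illustrative FlinkDeployment manifest (flink-deployment.yaml)
apiVersion: flink.apache.org/v1beta1
kind: FlinkDeployment
metadata:
  name: basic-example
spec:
  image: flink:1.16
  flinkVersion: v1_16
  serviceAccount: flink
  jobManager:
    resource:
      memory: "2048m"
      cpu: 1
  taskManager:
    resource:
      memory: "2048m"
      cpu: 1
  job:
    jarURI: local:///opt/flink/examples/streaming/StateMachineExample.jar
    parallelism: 2
    upgradeMode: stateless
```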
