SparkKubernetesOperator

Provider: Kubernetes

Creates a sparkApplication object in a Kubernetes cluster.

Access Instructions

Install the Kubernetes provider package (apache-airflow-providers-cncf-kubernetes) into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters, as shown in the sketch below.
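A minimal sketch, assuming the apache-airflow-providers-cncf-kubernetes package is installed, a Kubernetes connection named kubernetes_default exists, and a spark_pi.yaml SparkApplication manifest sits alongside the DAG (the DAG ID, schedule, and file name are illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
    SparkKubernetesOperator,
)

with DAG(
    dag_id="spark_pi_example",
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Submit the SparkApplication described in spark_pi.yaml to the cluster
    # reachable through the "kubernetes_default" connection.
    submit_spark_pi = SparkKubernetesOperator(
        task_id="spark_pi_submit",
        namespace="default",
        application_file="spark_pi.yaml",
        kubernetes_conn_id="kubernetes_default",
    )
```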

Parameters

application_file (Required): Defines the Kubernetes 'custom_resource_definition' of 'sparkApplication' as either a path to a '.yaml' file, a path to a '.json' file, a YAML string, or a JSON string (see the manifest sketch after this list).
namespace: The Kubernetes namespace in which to create the sparkApplication.
kubernetes_conn_id: The Kubernetes connection ID for the target Kubernetes cluster.
api_group: The Kubernetes API group of the sparkApplication.
api_version: The Kubernetes API version of the sparkApplication.
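Because application_file also accepts a raw YAML string, the manifest can be defined inline. A minimal sketch of a SparkApplication spec following the spark-on-k8s-operator v1beta2 API; the image, mainClass, mainApplicationFile, and resource values are placeholders, not required values:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
    SparkKubernetesOperator,
)

# Inline SparkApplication manifest (spark-on-k8s-operator v1beta2 API).
# Image, mainApplicationFile, and resource settings are illustrative.
SPARK_PI_MANIFEST = """
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: gcr.io/spark-operator/spark:v3.1.1
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar
  sparkVersion: "3.1.1"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark
  executor:
    cores: 1
    instances: 1
    memory: 512m
"""

with DAG(
    dag_id="spark_pi_inline_example",
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # The manifest is passed directly as a YAML string instead of a file path.
    submit_spark_pi = SparkKubernetesOperator(
        task_id="spark_pi_submit",
        namespace="default",
        application_file=SPARK_PI_MANIFEST,
        kubernetes_conn_id="kubernetes_default",
    )
```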

Documentation

See also

For more detail about the SparkApplication object, have a look at the reference: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/v1beta2-1.1.0-2.4.5/docs/api-docs.md#sparkapplication