airflow.providers.cncf.kubernetes.sensors.spark_kubernetes

Classes

SparkKubernetesSensor

Checks a sparkApplication object in a Kubernetes cluster.

Module Contents

class airflow.providers.cncf.kubernetes.sensors.spark_kubernetes.SparkKubernetesSensor(*, application_name, attach_log=False, namespace=None, container_name='spark-kubernetes-driver', kubernetes_conn_id='kubernetes_default', api_group='sparkoperator.k8s.io', api_version='v1beta2', **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Checks a sparkApplication object in a Kubernetes cluster.

See also

For more detail about the SparkApplication object, see the API reference: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/v1beta2-1.1.0-2.4.5/docs/api-docs.md#sparkapplication

Parameters:
  • application_name (str) – name of the sparkApplication resource

  • namespace (str | None) – the Kubernetes namespace in which the sparkApplication resides

  • container_name (str) – the name of the Kubernetes container in which the sparkApplication driver runs

  • kubernetes_conn_id (str) – the connection ID used to connect to the Kubernetes cluster.

  • attach_log (bool) – determines whether logs from the driver pod should be appended to the sensor log

  • api_group (str) – Kubernetes API group of the sparkApplication

  • api_version (str) – Kubernetes API version of the sparkApplication

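A minimal usage sketch (the DAG id, task id, application name, and namespace below are illustrative, not taken from this page): the sensor waits for a SparkApplication custom resource to reach a terminal state, re-checking on each poke.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.sensors.spark_kubernetes import (
    SparkKubernetesSensor,
)

with DAG(
    dag_id="spark_pi_monitor",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    watch = SparkKubernetesSensor(
        task_id="watch_spark_app",      # hypothetical task id
        application_name="spark-pi",    # name of the SparkApplication resource
        namespace="spark-jobs",         # namespace the resource lives in
        attach_log=True,                # append driver-pod logs to the sensor log
        kubernetes_conn_id="kubernetes_default",
        poke_interval=30,               # inherited from BaseSensorOperator
    )
```

Because this is a sensor, scheduling knobs such as poke_interval, timeout, and mode come from BaseSensorOperator rather than from this class.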
template_fields: collections.abc.Sequence[str] = ('application_name', 'namespace')[source]
FAILURE_STATES = ('FAILED', 'UNKNOWN')[source]
SUCCESS_STATES = ('COMPLETED',)[source]
application_name[source]
attach_log = False[source]
namespace = None[source]
container_name = 'spark-kubernetes-driver'[source]
kubernetes_conn_id = 'kubernetes_default'[source]
api_group = 'sparkoperator.k8s.io'[source]
api_version = 'v1beta2'[source]
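The FAILURE_STATES and SUCCESS_STATES tuples above drive the poke logic. As a rough illustration (this is a simplified re-creation, not the provider's actual implementation, and check_application_state is a hypothetical helper name), the check reads the applicationState from the custom-resource status, raises on a failure state, and returns True only on a success state:

```python
# Mirrors the class attributes documented above.
FAILURE_STATES = ("FAILED", "UNKNOWN")
SUCCESS_STATES = ("COMPLETED",)


def check_application_state(response: dict) -> bool:
    """Return True when the application succeeded, False to keep poking.

    Raises RuntimeError on a failure state (the real sensor raises an
    Airflow exception instead; RuntimeError keeps this sketch dependency-free).
    """
    state = (
        response.get("status", {})
        .get("applicationState", {})
        .get("state")
    )
    if state in FAILURE_STATES:
        raise RuntimeError(f"Spark application failed with state: {state}")
    return state in SUCCESS_STATES


# A RUNNING application is not yet terminal, so the sensor keeps poking:
running = {"status": {"applicationState": {"state": "RUNNING"}}}
completed = {"status": {"applicationState": {"state": "COMPLETED"}}}
```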
property hook: airflow.providers.cncf.kubernetes.hooks.kubernetes.KubernetesHook[source]
poke(context)[source]

Override when deriving this class.
