airflow.providers.google.cloud.sensors.workflows

Module Contents

Classes

WorkflowExecutionSensor

Checks the state of an execution for the given workflow_id and execution_id.

class airflow.providers.google.cloud.sensors.workflows.WorkflowExecutionSensor(*, workflow_id, execution_id, location, project_id=PROVIDE_PROJECT_ID, success_states=None, failure_states=None, retry=DEFAULT, request_timeout=None, metadata=(), gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Checks the state of an execution for the given workflow_id and execution_id.

Parameters
  • workflow_id (str) – Required. The ID of the workflow.

  • execution_id (str) – Required. The ID of the execution.

  • project_id (str) – Required. The ID of the Google Cloud project the workflow belongs to.

  • location (str) – Required. The Google Cloud region in which the workflow execution was created.

  • success_states (set[google.cloud.workflows.executions_v1beta.Execution.State] | None) – Execution states to be considered as successful, by default it’s only SUCCEEDED state

  • failure_states (set[google.cloud.workflows.executions_v1beta.Execution.State] | None) – Execution states to be considered as failures, by default they are FAILED and CANCELLED states.

  • retry (google.api_core.retry.Retry | google.api_core.gapic_v1.method._MethodDefault) – A retry object used to retry requests. If None is specified, requests will not be retried.

  • request_timeout (float | None) – The amount of time, in seconds, to wait for the request to complete. Note that if retry is specified, the timeout applies to each individual attempt.

  • metadata (collections.abc.Sequence[tuple[str, str]]) – Additional metadata that is provided to the method.
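Taken together, the parameters above are typically supplied from a DAG in which the sensor waits on an execution started by WorkflowsCreateExecutionOperator, pulling the execution_id from XCom. A minimal DAG sketch (the project ID, region, workflow ID, and task IDs below are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.sensors.workflows import WorkflowExecutionSensor

with DAG(
    dag_id="example_workflow_execution_sensor",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Wait for the execution to finish. execution_id is assumed to have been
    # pushed to XCom by an upstream "create_execution" task
    # (WorkflowsCreateExecutionOperator does this under the "execution_id" key).
    wait_for_execution = WorkflowExecutionSensor(
        task_id="wait_for_execution",
        project_id="my-project",          # hypothetical project
        location="us-central1",           # hypothetical region
        workflow_id="my-workflow",        # hypothetical workflow
        execution_id='{{ task_instance.xcom_pull("create_execution", key="execution_id") }}',
    )
```

Because location, workflow_id, and execution_id are template fields, Jinja expressions such as the XCom pull above are rendered at runtime.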

template_fields: collections.abc.Sequence[str] = ('location', 'workflow_id', 'execution_id')[source]
poke(context)[source]

Poll the state of the execution. Returns True once the execution reaches one of success_states, returns False while it is still running, and raises an exception if it reaches one of failure_states.
