airflow.providers.amazon.aws.sensors.mwaa

Classes

MwaaDagRunSensor

Waits for a DAG Run in an MWAA Environment to complete.

Module Contents

class airflow.providers.amazon.aws.sensors.mwaa.MwaaDagRunSensor(*, external_env_name, external_dag_id, external_dag_run_id, success_states=None, failure_states=None, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), poke_interval=60, max_retries=720, **kwargs)

Bases: airflow.providers.amazon.aws.sensors.base_aws.AwsBaseSensor[airflow.providers.amazon.aws.hooks.mwaa.MwaaHook]

Waits for a DAG Run in an MWAA Environment to complete.

If the DAG Run fails, an AirflowException is raised.

See also

For more information on how to use this sensor, take a look at the guide: Wait on the state of an AWS MWAA DAG Run

Parameters:
  • external_env_name (str) – The external MWAA environment name that contains the DAG Run you want to wait for (templated)

  • external_dag_id (str) – The DAG ID in the external MWAA environment that contains the DAG Run you want to wait for (templated)

  • external_dag_run_id (str) – The DAG Run ID in the external MWAA environment that you want to wait for (templated)

  • success_states (collections.abc.Collection[str] | None) – Collection of DAG Run states that will mark this task as successful. Defaults to {airflow.utils.state.DagRunState.SUCCESS}. (templated)

  • failure_states (collections.abc.Collection[str] | None) – Collection of DAG Run states that will mark this task as failed and raise an AirflowException. Defaults to {airflow.utils.state.DagRunState.FAILED}. (templated)

  • deferrable (bool) – If True, the sensor runs in deferrable mode, which requires the aiobotocore module to be installed. (default: False, but this can be overridden by setting default_deferrable to True in the Airflow config file)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 60)

  • max_retries (int) – Maximum number of polling attempts before giving up and returning the current state. (default: 720)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. When running Airflow in a distributed manner with aws_conn_id set to None or empty, the default boto3 configuration is used and must be maintained on each worker node.

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
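The way success_states and failure_states drive the sensor can be sketched independently of Airflow. The helper below is illustrative only (not the provider's internal implementation); its defaults mirror the documented ones, with "success" as the sole success state and "failed" as the sole failure state:

```python
# Illustrative sketch of the state check each poke performs.
# Not the provider's actual code; names here are hypothetical.

class ExternalDagRunFailedError(Exception):
    """Raised when the watched DAG run reaches a failure state."""


def evaluate_dag_run_state(state, success_states=None, failure_states=None):
    """Return True if the run reached a success state, False if it is
    still in progress, and raise if it reached a failure state."""
    success_states = set(success_states or {"success"})
    failure_states = set(failure_states or {"failed"})
    if overlap := success_states & failure_states:
        # A state cannot mean both success and failure.
        raise ValueError(f"success and failure states overlap: {overlap}")
    if state in failure_states:
        raise ExternalDagRunFailedError(f"DAG run ended in state {state!r}")
    return state in success_states
```

A queued or running DAG run falls in neither collection, so the helper returns False and, in the real sensor, poking would continue until the run reaches a terminal state or max_retries is exhausted.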

aws_hook_class
template_fields: collections.abc.Sequence[str]
success_states
failure_states
external_env_name
external_dag_id
external_dag_run_id
deferrable = True
poke_interval = 60
max_retries = 720
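Taken together, the poke_interval and max_retries defaults bound how long the sensor will wait: 720 polls at 60-second intervals give a maximum wait of 60 × 720 = 43,200 seconds, i.e. 12 hours. A quick sanity check:

```python
# Documented defaults for MwaaDagRunSensor.
poke_interval = 60   # seconds between polls
max_retries = 720    # maximum polling attempts

max_wait_seconds = poke_interval * max_retries
max_wait_hours = max_wait_seconds / 3600
print(max_wait_seconds, max_wait_hours)  # 43200 12.0
```

Raise max_retries (or poke_interval) for DAG runs expected to exceed this budget.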
poke(context)

Override when deriving this class.

execute_complete(context, event=None)
execute(context)

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
