airflow.providers.amazon.aws.triggers.sagemaker_unified_studio

This module contains the Amazon SageMaker Unified Studio Notebook job trigger.

Classes

SageMakerNotebookJobTrigger

Watches a notebook job and triggers when it finishes.

Module Contents

class airflow.providers.amazon.aws.triggers.sagemaker_unified_studio.SageMakerNotebookJobTrigger(execution_id, execution_name, waiter_delay, waiter_max_attempts, **kwargs)

Bases: airflow.triggers.base.BaseTrigger

Watches a notebook job and triggers when it finishes.

Examples:
from airflow.providers.amazon.aws.triggers.sagemaker_unified_studio import SageMakerNotebookJobTrigger

notebook_trigger = SageMakerNotebookJobTrigger(
    execution_id="notebook_job_1234",
    execution_name="notebook_task",
    waiter_delay=10,
    waiter_max_attempts=1440,
)
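
In practice an operator starts the notebook job and then defers to this trigger; the triggerer process resumes the task when the trigger fires. The sketch below is illustrative only: the operator class, the execute_complete callback name, and the event payload keys are assumptions, while self.defer and the deferred-callback flow follow the standard Airflow deferrable-operator pattern.

from airflow.models.baseoperator import BaseOperator
from airflow.providers.amazon.aws.triggers.sagemaker_unified_studio import SageMakerNotebookJobTrigger


class ExampleNotebookOperator(BaseOperator):
    def execute(self, context):
        # Start the notebook job here, then hand the waiting off to the triggerer.
        self.defer(
            trigger=SageMakerNotebookJobTrigger(
                execution_id="notebook_job_1234",
                execution_name="notebook_task",
                waiter_delay=10,
                waiter_max_attempts=1440,
            ),
            method_name="execute_complete",
        )

    def execute_complete(self, context, event=None):
        # Runs when the trigger fires; the exact payload keys are an assumption.
        if event and event.get("status") != "success":
            raise RuntimeError(f"Notebook job did not finish successfully: {event}")
        return event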
Parameters:
  • execution_id – A unique, meaningful id for the notebook execution to watch.

  • execution_name – A unique, meaningful name for the notebook execution.

  • waiter_delay – Interval, in seconds, between checks of the notebook execution status.

  • waiter_max_attempts – Maximum number of status checks to make before reporting the job as FAILED.

execution_id
execution_name
waiter_delay
waiter_max_attempts
serialize()

Return the information needed to reconstruct this Trigger.

Returns:

Tuple of (class path, keyword arguments needed to re-instantiate).
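
Concretely, and consistent with the return description above, the serialize implementation for a trigger like this one plausibly looks like the following; the exact dictionary keys are an assumption based on the constructor arguments.

def serialize(self):
    # Dotted class path plus the kwargs needed to rebuild this trigger on a triggerer.
    return (
        "airflow.providers.amazon.aws.triggers.sagemaker_unified_studio.SageMakerNotebookJobTrigger",
        {
            "execution_id": self.execution_id,
            "execution_name": self.execution_name,
            "waiter_delay": self.waiter_delay,
            "waiter_max_attempts": self.waiter_max_attempts,
        },
    )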

async run()

Run the trigger in an asynchronous context.

The trigger should yield an Event whenever it wants to fire off an event, and return None if it is finished. Single-event triggers should thus yield and then immediately return.

If it yields, it is likely that it will be resumed very quickly, but it may not be (e.g. if the workload is being moved to another triggerer process, or a multi-event trigger was being used for a single-event task defer).

In either case, Trigger classes should assume they will be persisted, and then rely on cleanup() being called when they are no longer needed.
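
As a rough illustration of the yield-then-return pattern described above, a single-event trigger's run method typically polls in a loop, yields exactly one TriggerEvent, and returns. The trigger class, its _get_status helper, and the status strings below are hypothetical stand-ins, not this provider's actual waiter logic.

import asyncio

from airflow.triggers.base import BaseTrigger, TriggerEvent


class ExampleSingleEventTrigger(BaseTrigger):
    """Illustrative single-event trigger; not the provider implementation."""

    def __init__(self, job_id, waiter_delay, waiter_max_attempts, **kwargs):
        super().__init__(**kwargs)
        self.job_id = job_id
        self.waiter_delay = waiter_delay
        self.waiter_max_attempts = waiter_max_attempts

    def serialize(self):
        return (
            "example.triggers.ExampleSingleEventTrigger",
            {
                "job_id": self.job_id,
                "waiter_delay": self.waiter_delay,
                "waiter_max_attempts": self.waiter_max_attempts,
            },
        )

    async def run(self):
        # Poll until the job leaves the running state, then yield one event and return.
        for _ in range(self.waiter_max_attempts):
            status = await self._get_status(self.job_id)
            if status != "RUNNING":
                yield TriggerEvent({"status": status, "job_id": self.job_id})
                return
            await asyncio.sleep(self.waiter_delay)
        # Out of attempts: report failure as the single event.
        yield TriggerEvent({"status": "FAILED", "job_id": self.job_id})

    async def _get_status(self, job_id):
        # Hypothetical helper: a real trigger would call the service API asynchronously here.
        return "COMPLETED"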
