airflow.providers.amazon.aws.sensors.bedrock

Module Contents

Classes

BedrockBaseSensor

General sensor behavior for Amazon Bedrock.

BedrockCustomizeModelCompletedSensor

Poll the state of the model customization job until it reaches a terminal state; fails if the job fails.

BedrockProvisionModelThroughputCompletedSensor

Poll the provisioned model throughput job until it reaches a terminal state; fails if the job fails.

BedrockKnowledgeBaseActiveSensor

Poll the Knowledge Base status until it reaches a terminal state; fails if creation fails.

BedrockIngestionJobSensor

Poll the ingestion job status until it reaches a terminal state; fails if creation fails.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockBaseSensor(deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.providers.amazon.aws.sensors.base_aws.AwsBaseSensor[_GenericBedrockHook]

General sensor behavior for Amazon Bedrock.

Subclasses must implement the following methods:
  • get_state()

Subclasses must set the following fields:
  • INTERMEDIATE_STATES

  • FAILURE_STATES

  • SUCCESS_STATES

  • FAILURE_MESSAGE

Parameters

deferrable (bool) – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in the config file by setting default_deferrable to True)

INTERMEDIATE_STATES: tuple[str, ...] = ()[source]
FAILURE_STATES: tuple[str, ...] = ()[source]
SUCCESS_STATES: tuple[str, ...] = ()[source]
FAILURE_MESSAGE = ''[source]
aws_hook_class: type[_GenericBedrockHook][source]
ui_color = '#66c3ff'[source]
poke(context, **kwargs)[source]

Override when deriving this class.

abstract get_state()[source]

Implement in subclasses.
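The contract above can be sketched without Airflow installed: a subclass supplies the state tuples, the failure message, and get_state(), and poke() maps a failure state to an error, a success state to True, and an intermediate state to False. The classes below are simplified stand-ins for BedrockBaseSensor and its subclasses, not the real implementation (which also wires in the AWS hook and deferral logic).

```python
# Simplified stand-in for BedrockBaseSensor (assumption: this mirrors the
# documented contract; the real class also handles hooks and deferrable mode).
class FakeBedrockSensor:
    INTERMEDIATE_STATES: tuple[str, ...] = ()
    FAILURE_STATES: tuple[str, ...] = ()
    SUCCESS_STATES: tuple[str, ...] = ()
    FAILURE_MESSAGE = ""

    def get_state(self) -> str:
        raise NotImplementedError("Implement in subclasses.")

    def poke(self) -> bool:
        # Failure states raise; success states finish the sensor;
        # intermediate states keep it poking.
        state = self.get_state()
        if state in self.FAILURE_STATES:
            raise RuntimeError(self.FAILURE_MESSAGE)
        return state in self.SUCCESS_STATES


class FakeCustomizeModelSensor(FakeBedrockSensor):
    # State tuples copied from BedrockCustomizeModelCompletedSensor below.
    INTERMEDIATE_STATES = ("InProgress",)
    FAILURE_STATES = ("Failed", "Stopping", "Stopped")
    SUCCESS_STATES = ("Completed",)
    FAILURE_MESSAGE = "Bedrock model customization job sensor failed."

    def __init__(self, state: str):
        self._state = state

    def get_state(self) -> str:
        return self._state
```

With this sketch, `FakeCustomizeModelSensor("InProgress").poke()` returns False (keep waiting), `"Completed"` returns True, and `"Failed"` raises with the FAILURE_MESSAGE.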

class airflow.providers.amazon.aws.sensors.bedrock.BedrockCustomizeModelCompletedSensor(*, job_name, max_retries=75, poke_interval=120, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]

Poll the state of the model customization job until it reaches a terminal state; fails if the job fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock customize model job

Parameters
  • job_name (str) – The name of the Bedrock model customization job.

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in the config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 120)

  • max_retries (int) – Number of times before returning the current state. (default: 75)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, the default boto3 configuration is used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

INTERMEDIATE_STATES: tuple[str, ...] = ('InProgress',)[source]
FAILURE_STATES: tuple[str, ...] = ('Failed', 'Stopping', 'Stopped')[source]
SUCCESS_STATES: tuple[str, ...] = ('Completed',)[source]
FAILURE_MESSAGE = 'Bedrock model customization job sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

get_state()[source]

Implement in subclasses.
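With the defaults above, the sensor checks the job once every 120 seconds, up to 75 times, so it waits at most 75 × 120 s = 9000 s, i.e. 150 minutes, before giving up. A quick sketch of that budget in plain Python (assumption: one status check per poke_interval, up to max_retries checks):

```python
# Rough wall-clock budget implied by the sensor's defaults.
poke_interval = 120  # seconds between status checks (default: 120)
max_retries = 75     # number of checks before giving up (default: 75)

max_wait_seconds = poke_interval * max_retries
print(max_wait_seconds // 60, "minutes")  # 150 minutes
```

Model customization jobs can run for hours, which is why these defaults are far larger than the knowledge-base sensor's (5 s × 24) further down the page.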

class airflow.providers.amazon.aws.sensors.bedrock.BedrockProvisionModelThroughputCompletedSensor(*, model_id, poke_interval=60, max_retries=20, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]

Poll the provisioned model throughput job until it reaches a terminal state; fails if the job fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock provision model throughput job

Parameters
  • model_id (str) – The ARN or name of the provisioned throughput.

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in the config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 60)

  • max_retries (int) – Number of times before returning the current state. (default: 20)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, the default boto3 configuration is used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

INTERMEDIATE_STATES: tuple[str, ...] = ('Creating', 'Updating')[source]
FAILURE_STATES: tuple[str, ...] = ('Failed',)[source]
SUCCESS_STATES: tuple[str, ...] = ('InService',)[source]
FAILURE_MESSAGE = 'Bedrock provision model throughput sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
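Under the hood, each poll retrieves a status string that is matched against the three tuples above. A plain-Python sketch of that classification (no Airflow or AWS call is made; the status values are the ones listed for this sensor):

```python
# State tuples as documented for BedrockProvisionModelThroughputCompletedSensor.
INTERMEDIATE_STATES = ("Creating", "Updating")
FAILURE_STATES = ("Failed",)
SUCCESS_STATES = ("InService",)

def classify(status: str) -> str:
    # Map a raw status string onto the sensor's three possible outcomes.
    if status in FAILURE_STATES:
        return "fail"       # sensor raises with FAILURE_MESSAGE
    if status in SUCCESS_STATES:
        return "done"       # poke() returns True, task succeeds
    return "waiting"        # poke() returns False, sensor keeps polling

print([classify(s) for s in ("Creating", "Updating", "InService")])
# ['waiting', 'waiting', 'done']
```

A provisioned throughput that is being resized reports Updating, which this sensor treats the same as Creating: keep waiting until InService.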

class airflow.providers.amazon.aws.sensors.bedrock.BedrockKnowledgeBaseActiveSensor(*, knowledge_base_id, poke_interval=5, max_retries=24, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockAgentHook]

Poll the Knowledge Base status until it reaches a terminal state; fails if creation fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock Knowledge Base

Parameters
  • knowledge_base_id (str) – The unique identifier of the knowledge base for which to get information. (templated)

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in the config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 5)

  • max_retries (int) – Number of times before returning the current state. (default: 24)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, the default boto3 configuration is used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

INTERMEDIATE_STATES: tuple[str, ...] = ('CREATING', 'UPDATING')[source]
FAILURE_STATES: tuple[str, ...] = ('DELETING', 'FAILED')[source]
SUCCESS_STATES: tuple[str, ...] = ('ACTIVE',)[source]
FAILURE_MESSAGE = 'Bedrock Knowledge Base Active sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
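Note that this sensor and BedrockIngestionJobSensor, both backed by BedrockAgentHook, use upper-case state names, unlike the model-level sensors above. A minimal stub of the status check (the nested response shape is an assumption based on the bedrock-agent GetKnowledgeBase API, and the knowledge base ID is a made-up placeholder; nothing here calls AWS):

```python
# State tuples as documented for BedrockKnowledgeBaseActiveSensor.
INTERMEDIATE_STATES = ("CREATING", "UPDATING")
FAILURE_STATES = ("DELETING", "FAILED")
SUCCESS_STATES = ("ACTIVE",)

# Stubbed bedrock-agent response (assumed shape: status nested under
# "knowledgeBase"; "kb-123" is a hypothetical identifier).
stub = {"knowledgeBase": {"knowledgeBaseId": "kb-123", "status": "ACTIVE"}}

state = stub["knowledgeBase"]["status"]
sensor_done = state in SUCCESS_STATES
print(sensor_done)  # True
```

DELETING counts as a failure state here: a knowledge base that starts deleting while the sensor waits will never reach ACTIVE, so the sensor fails fast instead of polling out its 24 retries.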

class airflow.providers.amazon.aws.sensors.bedrock.BedrockIngestionJobSensor(*, knowledge_base_id, data_source_id, ingestion_job_id, poke_interval=60, max_retries=10, **kwargs)[source]

Bases: BedrockBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockAgentHook]

Poll the ingestion job status until it reaches a terminal state; fails if creation fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock ingestion job to finish

Parameters
  • knowledge_base_id (str) – The unique identifier of the knowledge base for which to get information. (templated)

  • data_source_id (str) – The unique identifier of the data source in the ingestion job. (templated)

  • ingestion_job_id (str) – The unique identifier of the ingestion job. (templated)

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires the aiobotocore module to be installed. (default: False, but can be overridden in the config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 60)

  • max_retries (int) – Number of times before returning the current state. (default: 10)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, the default boto3 configuration is used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

INTERMEDIATE_STATES: tuple[str, ...] = ('STARTING', 'IN_PROGRESS')[source]
FAILURE_STATES: tuple[str, ...] = ('FAILED',)[source]
SUCCESS_STATES: tuple[str, ...] = ('COMPLETE',)[source]
FAILURE_MESSAGE = 'Bedrock ingestion job sensor failed.'[source]
aws_hook_class[source]
template_fields: collections.abc.Sequence[str][source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
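A healthy ingestion job moves STARTING → IN_PROGRESS → COMPLETE, so successive pokes return False, False, then True; note the success state is COMPLETE, not COMPLETED. A plain-Python simulation of that polling sequence (a simplified sketch of the sensor's state check, with no Airflow or AWS involved):

```python
# State tuples as documented for BedrockIngestionJobSensor.
INTERMEDIATE_STATES = ("STARTING", "IN_PROGRESS")
FAILURE_STATES = ("FAILED",)
SUCCESS_STATES = ("COMPLETE",)

def poke(state: str) -> bool:
    # Mirror of the base-sensor contract: fail fast, succeed on a
    # terminal success state, otherwise keep waiting.
    if state in FAILURE_STATES:
        raise RuntimeError("Bedrock ingestion job sensor failed.")
    return state in SUCCESS_STATES

# Simulate successive polls of a job that finishes on the third check.
results = [poke(s) for s in ("STARTING", "IN_PROGRESS", "COMPLETE")]
print(results)  # [False, False, True]
```

With the defaults of poke_interval=60 and max_retries=10, this sensor allows an ingestion job about ten minutes to finish before timing out.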