airflow.providers.amazon.aws.operators.s3_tables

Amazon S3 Tables operators.

Classes

S3TablesCreateTableOperator

Create a new table in an Amazon S3 Tables namespace.

S3TablesDeleteTableOperator

Delete a table from an Amazon S3 Tables namespace.

Module Contents

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesCreateTableOperator(*, table_bucket_arn, namespace, table_name, format='ICEBERG', metadata=None, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook]

Create a new table in an Amazon S3 Tables namespace.

See also

For more information on how to use this operator, take a look at the guide: Create an Amazon S3 Table

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket to create the table in. (templated)

  • namespace (str) – The namespace to associate with the table. (templated)

  • table_name (str) – The name of the table. (templated)

  • format (str) – The table format. (templated) Currently only ICEBERG is supported.

  • metadata (dict[str, Any] | None) – Optional Iceberg schema metadata. (templated) Example: {"iceberg": {"schema": {"fields": [{"name": "id", "type": "int", "required": True}]}}}

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty, the default boto3 behaviour is used. When running Airflow in a distributed manner with aws_conn_id None or empty, the default boto3 configuration must be maintained on each worker node.

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
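A minimal sketch of assembling the templated arguments, including the Iceberg schema envelope shown for the metadata parameter above. The bucket ARN, namespace, and field names are illustrative placeholders, and the helper function is not part of the provider; the resulting dict would be passed as keyword arguments to S3TablesCreateTableOperator.

```python
def build_create_table_kwargs(table_bucket_arn, namespace, table_name, fields):
    """Assemble S3TablesCreateTableOperator kwargs, wrapping `fields`
    in the Iceberg schema envelope expected by the metadata parameter."""
    return {
        "table_bucket_arn": table_bucket_arn,
        "namespace": namespace,
        "table_name": table_name,
        "format": "ICEBERG",  # currently the only supported table format
        "metadata": {"iceberg": {"schema": {"fields": fields}}},
    }


kwargs = build_create_table_kwargs(
    # Placeholder ARN; a real table bucket ARN is account- and region-specific.
    "arn:aws:s3tables:us-east-1:123456789012:bucket/example-bucket",
    "my_namespace",
    "my_table",
    [{"name": "id", "type": "int", "required": True}],
)
# In a DAG this would become, e.g.:
#   S3TablesCreateTableOperator(task_id="create_table", **kwargs)
```

All of table_bucket_arn, namespace, table_name, format, and metadata are templated, so any of these values may also be Jinja expressions rendered at runtime.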

template_fields: collections.abc.Sequence[str][source]
template_fields_renderers[source]
aws_hook_class[source]
table_bucket_arn[source]
namespace[source]
table_name[source]
format = 'ICEBERG'[source]
metadata = None[source]
execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.amazon.aws.operators.s3_tables.S3TablesDeleteTableOperator(*, table_bucket_arn, namespace, table_name, version_token=None, **kwargs)[source]

Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook]

Delete a table from an Amazon S3 Tables namespace.

See also

For more information on how to use this operator, take a look at the guide: Delete a Table

Parameters:
  • table_bucket_arn (str) – The ARN of the table bucket containing the table. (templated)

  • namespace (str) – The namespace of the table. (templated)

  • table_name (str) – The name of the table to delete. (templated)

  • version_token (str | None) – Optional version token for optimistic concurrency. (templated)
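A minimal sketch of building the delete operator's arguments. The helper is illustrative, not part of the provider: it includes version_token only when supplied, since the token is optional and is used for an optimistic-concurrency check (the delete is rejected if the table has changed since the token was obtained).

```python
def build_delete_table_kwargs(table_bucket_arn, namespace, table_name,
                              version_token=None):
    """Assemble S3TablesDeleteTableOperator kwargs; the version token is
    included only when the caller wants the optimistic-concurrency check."""
    kwargs = {
        "table_bucket_arn": table_bucket_arn,
        "namespace": namespace,
        "table_name": table_name,
    }
    if version_token is not None:
        kwargs["version_token"] = version_token
    return kwargs


kwargs = build_delete_table_kwargs(
    # Placeholder ARN, as above.
    "arn:aws:s3tables:us-east-1:123456789012:bucket/example-bucket",
    "my_namespace",
    "my_table",
)
# In a DAG this would become, e.g.:
#   S3TablesDeleteTableOperator(task_id="delete_table", **kwargs)
```

When version_token is omitted, the table is deleted unconditionally.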

template_fields: collections.abc.Sequence[str][source]
aws_hook_class[source]
table_bucket_arn[source]
namespace[source]
table_name[source]
version_token = None[source]
execute(context)[source]

Derive when creating an operator.

The main method to execute the task. Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
