airflow.providers.google.cloud.operators.bigtable

This module contains Google Cloud Bigtable operators.

Module Contents

Classes

BigtableValidationMixin

Common class for Cloud Bigtable operators for validating required fields.

BigtableCreateInstanceOperator

Creates a new Cloud Bigtable instance.

BigtableUpdateInstanceOperator

Updates an existing Cloud Bigtable instance.

BigtableDeleteInstanceOperator

Deletes the Cloud Bigtable instance, including its clusters and all related tables.

BigtableCreateTableOperator

Creates the table in the Cloud Bigtable instance.

BigtableDeleteTableOperator

Deletes the Cloud Bigtable table.

BigtableUpdateClusterOperator

Updates a Cloud Bigtable cluster.

class airflow.providers.google.cloud.operators.bigtable.BigtableValidationMixin[source]

Common class for Cloud Bigtable operators for validating required fields.

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = [][source]
class airflow.providers.google.cloud.operators.bigtable.BigtableCreateInstanceOperator(*, instance_id, main_cluster_id, main_cluster_zone, project_id=PROVIDE_PROJECT_ID, replica_clusters=None, instance_display_name=None, instance_type=None, instance_labels=None, cluster_nodes=None, cluster_storage_type=None, timeout=None, gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.providers.google.cloud.operators.cloud_base.GoogleCloudBaseOperator, BigtableValidationMixin

Creates a new Cloud Bigtable instance.

If the Cloud Bigtable instance with the given ID exists, the operator does not compare its configuration and immediately succeeds. No changes are made to the existing instance.

For more details about instance creation, see the reference: https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance.create

See also

For more information on how to use this operator, take a look at the guide: BigtableCreateInstanceOperator

Parameters
  • instance_id (str) – The ID of the Cloud Bigtable instance to create.

  • main_cluster_id (str) – The ID of the main cluster for the new instance.

  • main_cluster_zone (str) – The zone of the main cluster. See https://cloud.google.com/bigtable/docs/locations for more details.

  • project_id (str) – Optional, the ID of the Google Cloud project. If set to None or missing, the default project_id from the Google Cloud connection is used.

  • replica_clusters (list[dict[str, str]] | None) – (optional) A list of replica clusters for the new instance. Each cluster dictionary contains an id and a zone. Example: [{"id": "replica-1", "zone": "us-west1-a"}]

  • instance_type (google.cloud.bigtable.enums.Instance.Type | None) – (optional) The type of the instance.

  • instance_display_name (str | None) – (optional) Human-readable name of the instance. Defaults to instance_id.

  • instance_labels (dict | None) – (optional) Dictionary of labels to associate with the instance.

  • cluster_nodes (int | None) – (optional) Number of nodes for cluster.

  • cluster_storage_type (google.cloud.bigtable.enums.StorageType | None) – (optional) The type of storage.

  • timeout (float | None) – (optional) Timeout (in seconds) for instance creation. If None, the operator waits indefinitely.

  • gcp_conn_id (str) – The connection ID to use to connect to Google Cloud.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = ('instance_id', 'main_cluster_id', 'main_cluster_zone')[source]
template_fields: collections.abc.Sequence[str] = ('project_id', 'instance_id', 'main_cluster_id', 'main_cluster_zone', 'impersonation_chain')[source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.google.cloud.operators.bigtable.BigtableUpdateInstanceOperator(*, instance_id, project_id=PROVIDE_PROJECT_ID, instance_display_name=None, instance_type=None, instance_labels=None, timeout=None, gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.providers.google.cloud.operators.cloud_base.GoogleCloudBaseOperator, BigtableValidationMixin

Updates an existing Cloud Bigtable instance.

For more details about instance update, see the reference: https://googleapis.dev/python/bigtable/latest/instance.html#google.cloud.bigtable.instance.Instance.update

See also

For more information on how to use this operator, take a look at the guide: BigtableUpdateInstanceOperator

Parameters
  • instance_id (str) – The ID of the Cloud Bigtable instance to update.

  • project_id (str) – Optional, the ID of the Google Cloud project. If set to None or missing, the default project_id from the Google Cloud connection is used.

  • instance_display_name (str | None) – (optional) Human-readable name of the instance.

  • instance_type (google.cloud.bigtable.enums.Instance.Type | enum.IntEnum | None) – (optional) The type of the instance.

  • instance_labels (dict | None) – (optional) Dictionary of labels to associate with the instance.

  • timeout (float | None) – (optional) Timeout (in seconds) for instance update. If None, the operator waits indefinitely.

  • gcp_conn_id (str) – The connection ID to use to connect to Google Cloud.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = ['instance_id'][source]
template_fields: collections.abc.Sequence[str] = ('project_id', 'instance_id', 'impersonation_chain')[source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.google.cloud.operators.bigtable.BigtableDeleteInstanceOperator(*, instance_id, project_id=PROVIDE_PROJECT_ID, gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.providers.google.cloud.operators.cloud_base.GoogleCloudBaseOperator, BigtableValidationMixin

Deletes the Cloud Bigtable instance, including its clusters and all related tables.

For more details about deleting an instance, see the reference: https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance.delete

See also

For more information on how to use this operator, take a look at the guide: BigtableDeleteInstanceOperator

Parameters
  • instance_id (str) – The ID of the Cloud Bigtable instance to delete.

  • project_id (str) – Optional, the ID of the Google Cloud project. If set to None or missing, the default project_id from the Google Cloud connection is used.

  • gcp_conn_id (str) – The connection ID to use to connect to Google Cloud.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = ('instance_id',)[source]
template_fields: collections.abc.Sequence[str] = ('project_id', 'instance_id', 'impersonation_chain')[source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.google.cloud.operators.bigtable.BigtableCreateTableOperator(*, instance_id, table_id, project_id=PROVIDE_PROJECT_ID, initial_split_keys=None, column_families=None, gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.providers.google.cloud.operators.cloud_base.GoogleCloudBaseOperator, BigtableValidationMixin

Creates the table in the Cloud Bigtable instance.

For more details about creating a table, see the reference: https://googleapis.github.io/google-cloud-python/latest/bigtable/table.html#google.cloud.bigtable.table.Table.create

See also

For more information on how to use this operator, take a look at the guide: BigtableCreateTableOperator

Parameters
  • instance_id (str) – The ID of the Cloud Bigtable instance that will hold the new table.

  • table_id (str) – The ID of the table to be created.

  • project_id (str) – Optional, the ID of the Google Cloud project. If set to None or missing, the default project_id from the Google Cloud connection is used.

  • initial_split_keys (list | None) – (Optional) list of row keys in bytes that will be used to initially split the table into several tablets.

  • column_families (dict[str, google.cloud.bigtable.column_family.GarbageCollectionRule] | None) – (Optional) A map of column families to create. The key is the column family ID (str) and the value is a google.cloud.bigtable.column_family.GarbageCollectionRule.

  • gcp_conn_id (str) – The connection ID to use to connect to Google Cloud.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = ('instance_id', 'table_id')[source]
template_fields: collections.abc.Sequence[str] = ('project_id', 'instance_id', 'table_id', 'impersonation_chain')[source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.google.cloud.operators.bigtable.BigtableDeleteTableOperator(*, instance_id, table_id, project_id=PROVIDE_PROJECT_ID, app_profile_id=None, gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.providers.google.cloud.operators.cloud_base.GoogleCloudBaseOperator, BigtableValidationMixin

Deletes the Cloud Bigtable table.

For more details about deleting a table, see the reference: https://googleapis.github.io/google-cloud-python/latest/bigtable/table.html#google.cloud.bigtable.table.Table.delete

See also

For more information on how to use this operator, take a look at the guide: BigtableDeleteTableOperator

Parameters
  • instance_id (str) – The ID of the Cloud Bigtable instance.

  • table_id (str) – The ID of the table to be deleted.

  • project_id (str) – Optional, the ID of the Google Cloud project. If set to None or missing, the default project_id from the Google Cloud connection is used.

  • app_profile_id (str | None) – (optional) The ID of the application profile to use for this request.

  • gcp_conn_id (str) – The connection ID to use to connect to Google Cloud.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = ('instance_id', 'table_id')[source]
template_fields: collections.abc.Sequence[str] = ('project_id', 'instance_id', 'table_id', 'impersonation_chain')[source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

class airflow.providers.google.cloud.operators.bigtable.BigtableUpdateClusterOperator(*, instance_id, cluster_id, nodes, project_id=PROVIDE_PROJECT_ID, gcp_conn_id='google_cloud_default', impersonation_chain=None, **kwargs)[source]

Bases: airflow.providers.google.cloud.operators.cloud_base.GoogleCloudBaseOperator, BigtableValidationMixin

Updates a Cloud Bigtable cluster.

For more details about updating a Cloud Bigtable cluster, have a look at the reference: https://googleapis.github.io/google-cloud-python/latest/bigtable/cluster.html#google.cloud.bigtable.cluster.Cluster.update

See also

For more information on how to use this operator, take a look at the guide: BigtableUpdateClusterOperator

Parameters
  • instance_id (str) – The ID of the Cloud Bigtable instance.

  • cluster_id (str) – The ID of the Cloud Bigtable cluster to update.

  • nodes (int) – The desired number of nodes for the Cloud Bigtable cluster.

  • project_id (str) – Optional, the ID of the Google Cloud project. If set to None or missing, the default project_id from the Google Cloud connection is used.

  • gcp_conn_id (str) – The connection ID to use to connect to Google Cloud.

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

REQUIRED_ATTRIBUTES: collections.abc.Iterable[str] = ('instance_id', 'cluster_id', 'nodes')[source]
template_fields: collections.abc.Sequence[str] = ('project_id', 'instance_id', 'cluster_id', 'nodes', 'impersonation_chain')[source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
