airflow.providers.amazon.aws.transfers.exasol_to_s3

Transfers data from an Exasol database into an S3 bucket.

Module Contents

Classes

ExasolToS3Operator

Export data from an Exasol database to an AWS S3 bucket.

class airflow.providers.amazon.aws.transfers.exasol_to_s3.ExasolToS3Operator(*, query_or_table, key, bucket_name=None, replace=False, encrypt=False, gzip=False, acl_policy=None, query_params=None, export_params=None, exasol_conn_id='exasol_default', aws_conn_id='aws_default', **kwargs)[source]

Bases: airflow.models.BaseOperator

Export data from an Exasol database to an AWS S3 bucket.

Parameters
  • query_or_table (str) – The SQL statement to be executed, or the name of the table to export.

  • key (str) – S3 key that will point to the file.

  • bucket_name (str | None) – Name of the bucket in which to store the file.

  • replace (bool) – Whether to overwrite the key if it already exists. If replace is False and the key exists, an error will be raised.

  • encrypt (bool) – If True, the file will be encrypted server-side by S3 and stored in encrypted form while at rest in S3.

  • gzip (bool) – If True, the file will be compressed locally.

  • acl_policy (str | None) – String specifying the canned ACL policy for the file being uploaded to the S3 bucket.

  • query_params (dict | None) – Query parameters passed to the underlying export_to_file method of ExaConnection.

  • export_params (dict | None) – Extra parameters passed to the underlying export_to_file method of ExaConnection.

  • exasol_conn_id (str) – Reference to the Exasol connection to use.

  • aws_conn_id (str) – Reference to the AWS connection to use for S3.
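A minimal usage sketch follows. The DAG id, bucket name, table name, and the particular export_params option are hypothetical placeholders, not values prescribed by this operator:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.transfers.exasol_to_s3 import ExasolToS3Operator

    with DAG(
        dag_id="exasol_to_s3_example",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ):
        export_orders = ExasolToS3Operator(
            task_id="export_orders",
            # Either a full SQL statement or a bare table name works here.
            query_or_table="SELECT * FROM retail.orders",
            # key is templated, so Jinja expressions such as {{ ds }} are rendered.
            key="exports/orders/{{ ds }}/orders.csv",
            bucket_name="my-data-lake",  # hypothetical bucket
            replace=True,
            # export_params is forwarded to pyexasol's export_to_file; whether
            # this particular option applies depends on your pyexasol version.
            export_params={"with_column_names": True},
            exasol_conn_id="exasol_default",
            aws_conn_id="aws_default",
        )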

template_fields: collections.abc.Sequence[str] = ('query_or_table', 'key', 'bucket_name', 'query_params', 'export_params')[source]
template_fields_renderers[source]
template_ext: collections.abc.Sequence[str] = ('.sql',)[source]
ui_color = '#ededed'[source]
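Because template_ext includes '.sql', a templated query_or_table value ending in .sql is treated as a path to a file (resolved against the DAG's template search path) whose contents are rendered with Jinja before execution. A brief sketch, with a hypothetical file name:

    export_from_file = ExasolToS3Operator(
        task_id="export_from_file",
        # The contents of sql/daily_orders.sql are loaded and rendered as a
        # Jinja template because the value ends with ".sql" (see template_ext).
        query_or_table="sql/daily_orders.sql",
        key="exports/daily/{{ ds }}.csv",
        bucket_name="my-data-lake",  # hypothetical bucket
    )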
execute(context)[source]

This is the main method to derive when creating an operator.

Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
