airflow.providers.teradata.transfers.s3_to_teradata

Classes

S3ToTeradataOperator

Loads CSV, JSON and Parquet format data from Amazon S3 to Teradata.

Module Contents

class airflow.providers.teradata.transfers.s3_to_teradata.S3ToTeradataOperator(*, s3_source_key, public_bucket=False, teradata_table, aws_conn_id='aws_default', teradata_conn_id='teradata_default', teradata_authorization_name='', **kwargs)[source]

Bases: airflow.models.BaseOperator

Loads CSV, JSON and Parquet format data from Amazon S3 to Teradata.

See also

For more information on how to use this operator, take a look at the guide: S3ToTeradataOperator

Parameters:

s3_source_key (str) – URI specifying the location of the source data in the S3 bucket. (templated)
public_bucket (bool) – Whether the S3 bucket is publicly accessible. Defaults to False; for a private bucket, authentication must be provided via aws_conn_id or teradata_authorization_name.
teradata_table (str) – Name of the destination Teradata table. (templated)
aws_conn_id (str) – Reference to the Airflow AWS connection used for S3 credentials. Defaults to 'aws_default'.
teradata_conn_id (str) – Reference to the Airflow Teradata connection. Defaults to 'teradata_default'.
teradata_authorization_name (str) – Name of a Teradata Authorization database object used to control access to the S3 object store. Defaults to ''.

Note that s3_source_key and teradata_table are templated, so you can use variables in them if you wish.
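A minimal usage sketch for a publicly readable bucket. The DAG id, bucket URI, and table name are illustrative placeholders, not values from this page; the S3 URI follows the Teradata Native Object Store convention assumed here.

from datetime import datetime

from airflow import DAG
from airflow.providers.teradata.transfers.s3_to_teradata import S3ToTeradataOperator

with DAG(
    dag_id="example_s3_to_teradata",
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    # public_bucket=True means no AWS credentials are needed for the read.
    transfer_public = S3ToTeradataOperator(
        task_id="transfer_public_s3_to_teradata",
        s3_source_key="/s3/your-bucket.s3.amazonaws.com/csv/",
        public_bucket=True,
        teradata_table="example_s3_table",
        teradata_conn_id="teradata_default",
    )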

template_fields: collections.abc.Sequence[str] = ('s3_source_key', 'teradata_table')[source]
ui_color = '#e07c24'[source]
s3_source_key[source]
public_bucket = False[source]
teradata_table[source]
aws_conn_id = 'aws_default'[source]
teradata_conn_id = 'teradata_default'[source]
teradata_authorization_name = ''[source]
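For a private bucket, access can be granted either through the AWS connection referenced by aws_conn_id or through a pre-created Teradata Authorization object. Continuing the DAG sketched above, the authorization object name below is a hypothetical placeholder.

    # Assumes an Authorization object named "s3_authorization" already exists in Teradata.
    transfer_private = S3ToTeradataOperator(
        task_id="transfer_private_s3_to_teradata",
        s3_source_key="/s3/your-private-bucket.s3.amazonaws.com/parquet/",
        public_bucket=False,
        teradata_table="example_s3_table_private",
        teradata_authorization_name="s3_authorization",
        teradata_conn_id="teradata_default",
    )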
execute(context)[source]

Derive this method when creating an operator.

Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
