airflow.providers.amazon.aws.transfers.s3_to_sftp

Module Contents

Classes

S3ToSFTPOperator

This operator enables the transfer of files from S3 to an SFTP server.

class airflow.providers.amazon.aws.transfers.s3_to_sftp.S3ToSFTPOperator(*, s3_bucket, s3_key, sftp_path, sftp_conn_id='ssh_default', aws_conn_id='aws_default', **kwargs)

Bases: airflow.models.BaseOperator

This operator enables the transfer of files from S3 to an SFTP server; a minimal usage sketch follows the parameter list below.

See also

For more information on how to use this operator, take a look at the guide: Amazon S3 To SFTP transfer operator

Parameters
  • s3_bucket (str) – The target S3 bucket from which the file is downloaded.

  • s3_key (str) – The S3 key identifying the file to download. Either a bare key or a full S3 URL is accepted (see get_s3_key below).

  • sftp_path (str) – The remote path on the SFTP server to which the file is uploaded.

  • sftp_conn_id (str) – The SFTP connection id used to establish the connection to the SFTP server.

  • aws_conn_id (str | None) – The Airflow connection used for AWS credentials. If None or empty, the default boto3 behaviour is used; when running Airflow in a distributed manner, that default boto3 configuration must be maintained on each worker node.
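A minimal usage sketch (the DAG id, bucket, key, and paths are hypothetical placeholders; the connection ids match the defaults above, and the DAG arguments assume Airflow 2.4+):

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.transfers.s3_to_sftp import S3ToSFTPOperator

    with DAG(
        dag_id="example_s3_to_sftp",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        # Downloads s3://my-bucket/data/report.csv from S3 and uploads it to
        # /tmp/report.csv on the SFTP server behind the ssh_default connection.
        transfer_s3_to_sftp = S3ToSFTPOperator(
            task_id="transfer_s3_to_sftp",
            s3_bucket="my-bucket",
            s3_key="data/report.csv",
            sftp_path="/tmp/report.csv",
            sftp_conn_id="ssh_default",
            aws_conn_id="aws_default",
        )

Because s3_key, sftp_path, and s3_bucket are template fields, these values can also be rendered from Jinja templates at runtime.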

template_fields: collections.abc.Sequence[str] = ('s3_key', 'sftp_path', 's3_bucket')

static get_s3_key(s3_key)

Parse the correct format for S3 keys, whether the value is passed as a bare key or as a full S3 URL.
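For illustration, a full S3 URL and a bare key should normalize to the same key (the bucket and key below are hypothetical):

    S3ToSFTPOperator.get_s3_key("s3://my-bucket/data/report.csv")  # -> "data/report.csv"
    S3ToSFTPOperator.get_s3_key("data/report.csv")                 # -> "data/report.csv"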

execute(context)

This is the main method to derive when creating an operator.

Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
