airflow.providers.amazon.aws.transfers.sftp_to_s3

Module Contents

Classes

SFTPToS3Operator

Transfer files from an SFTP server to Amazon S3.

class airflow.providers.amazon.aws.transfers.sftp_to_s3.SFTPToS3Operator(*, s3_bucket, s3_key, sftp_path, sftp_conn_id='ssh_default', s3_conn_id='aws_default', use_temp_file=True, fail_on_file_not_exist=True, **kwargs)[source]

Bases: airflow.models.BaseOperator

Transfer files from an SFTP server to Amazon S3.

See also

For more information on how to use this operator, take a look at the guide: SFTP to Amazon S3 transfer operator

Parameters
  • sftp_conn_id (str) – The SFTP connection ID used to connect to the SFTP server.

  • sftp_path (str) – The remote path of the file to download from the SFTP server.

  • s3_conn_id (str) – The S3 connection ID used to connect to Amazon S3.

  • s3_bucket (str) – The target S3 bucket to which the file is uploaded.

  • s3_key (str) – The target S3 key (object path) under which the file is uploaded.

  • use_temp_file (bool) – If True, the file is first downloaded to a local temporary file; if False, it is streamed directly from SFTP to S3.

  • fail_on_file_not_exist (bool) – If True, the operator fails when the file does not exist; if False, the transfer is skipped without failing. Default is True.
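
A minimal usage sketch, assuming Airflow 2.4+ (for the schedule argument); the DAG id, bucket, and paths below are illustrative assumptions, not values from this page:

import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.sftp_to_s3 import SFTPToS3Operator

with DAG(
    dag_id="example_sftp_to_s3",          # hypothetical DAG id
    start_date=datetime.datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # s3_key, sftp_path, and s3_bucket are template fields, so Jinja
    # expressions such as {{ ds }} are rendered at runtime.
    transfer = SFTPToS3Operator(
        task_id="sftp_to_s3",
        sftp_conn_id="ssh_default",
        sftp_path="/remote/path/data.csv",   # illustrative remote path
        s3_conn_id="aws_default",
        s3_bucket="my-example-bucket",       # illustrative bucket
        s3_key="ingest/{{ ds }}/data.csv",   # illustrative key, templated
        use_temp_file=False,                 # stream from SFTP to S3, no local copy
        fail_on_file_not_exist=True,
    )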

template_fields: collections.abc.Sequence[str] = ('s3_key', 'sftp_path', 's3_bucket')[source]
static get_s3_key(s3_key)[source]

Parse an S3 key into the correct format regardless of whether a plain key or a full S3 URL is passed.
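
For illustration, assuming the method reduces a full s3:// URL to just the object key (the values below sketch that assumed behavior, not output taken from this page):

from airflow.providers.amazon.aws.transfers.sftp_to_s3 import SFTPToS3Operator

# A plain key passes through unchanged (assumed behavior):
SFTPToS3Operator.get_s3_key("ingest/data.csv")
# -> "ingest/data.csv"

# A full S3 URL is reduced to the key alone (assumed behavior):
SFTPToS3Operator.get_s3_key("s3://my-example-bucket/ingest/data.csv")
# -> "ingest/data.csv"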

execute(context)[source]

This is the main method to derive when creating an operator.

Context is the same dictionary used for rendering Jinja templates.

Refer to get_template_context for more context.
