airflow.providers.amazon.aws.bundles.s3¶
Classes¶
S3DagBundle: S3 DAG bundle - exposes a directory in S3 as a DAG bundle.
Module Contents¶
- class airflow.providers.amazon.aws.bundles.s3.S3DagBundle(*, aws_conn_id=AwsBaseHook.default_conn_name, bucket_name, prefix='', **kwargs)[source]¶
Bases: airflow.dag_processing.bundles.base.BaseDagBundle
S3 DAG bundle - exposes a directory in S3 as a DAG bundle.
This allows Airflow to load DAGs directly from an S3 bucket.
- Parameters:
aws_conn_id (str) – Airflow connection ID for AWS. Defaults to AwsBaseHook.default_conn_name.
bucket_name (str) – The name of the S3 bucket containing the DAG files.
prefix (str) – Optional subdirectory (key prefix) within the S3 bucket where the DAG files are stored. If not provided, DAGs are assumed to be at the root of the bucket.
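A minimal instantiation sketch (hedged: the name keyword and any other base-class options are assumed to be forwarded to BaseDagBundle through **kwargs, and the bucket and connection values below are hypothetical). In a real deployment the bundle is typically declared in the DAG processor's bundle configuration rather than constructed by hand.

```python
from airflow.providers.amazon.aws.bundles.s3 import S3DagBundle

# Hypothetical values for illustration only.
bundle = S3DagBundle(
    name="s3_dags",               # bundle name, forwarded to BaseDagBundle via **kwargs
    aws_conn_id="aws_default",    # Airflow connection ID for AWS
    bucket_name="my-dag-bucket",  # S3 bucket containing the DAG files
    prefix="dags/",               # optional key prefix within the bucket
)
```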
- s3_dags_dir: pathlib.Path[source]¶
- initialize()[source]¶
Initialize the bundle.
This method is called by the DAG processor and worker before the bundle is used, and allows for deferring expensive operations until that point in time. This will only be called when Airflow needs the bundle files on disk - some uses only need to call the view_url method, which can run without initializing the bundle.
This method must ultimately be safe to call concurrently from different threads or processes. If it isn’t naturally safe, you’ll need to make it so with some form of locking. There is a lock context manager on this class available for this purpose.
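As a hedged illustration of the locking guidance above, a subclass that adds its own non-atomic setup could wrap that extra work in the inherited lock context manager. The name lock() is an assumption based on the base-class description, and the subclass and marker file are purely illustrative.

```python
from airflow.providers.amazon.aws.bundles.s3 import S3DagBundle


class CachedS3DagBundle(S3DagBundle):
    """Illustrative subclass that adds extra, non-atomic local setup."""

    def initialize(self) -> None:
        super().initialize()  # stages the DAG files from S3 onto local disk
        # Serialize the extra setup so concurrent DAG processor / worker
        # processes do not race on the same local directory.
        with self.lock():  # assumed name of the base-class lock context manager
            marker = self.path / ".initialized"  # hypothetical marker file
            marker.touch(exist_ok=True)
```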
- get_current_version()[source]¶
Return the current version of the DAG bundle. Versioning is currently not supported for S3 bundles.
- property path: pathlib.Path[source]¶
Return the local path to the DAG files.
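Continuing the instantiation sketch above, a hedged usage example: initialize() stages the S3 objects locally, after which path points at the directory containing the downloaded DAG files.

```python
bundle.initialize()    # downloads the DAG files from S3 to local disk
dag_dir = bundle.path  # pathlib.Path to the local DAG directory
print(sorted(p.name for p in dag_dir.glob("*.py")))  # top-level .py files in the bundle directory
```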