Templates reference

Variables, macros, and filters can be used in templates (see the Jinja Templating section).

The following are available out of the box with Airflow. Additional custom macros can be added globally through Plugins, or at a DAG level through the DAG.user_defined_macros argument.
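For example, here is a minimal sketch of registering a DAG-level macro via user_defined_macros; the helper name days_ago_str and the DAG/task IDs are illustrative, not part of Airflow:

import pendulum

from airflow import DAG
from airflow.operators.bash import BashOperator


def days_ago_str(n: int) -> str:
    """Return the date n days ago as YYYY-MM-DD (illustrative helper)."""
    return pendulum.today("UTC").subtract(days=n).to_date_string()


with DAG(
    dag_id="custom_macro_example",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,
    user_defined_macros={"days_ago_str": days_ago_str},
):
    # days_ago_str is now callable from any template rendered by this DAG
    BashOperator(task_id="echo_date", bash_command="echo {{ days_ago_str(7) }}")

Once registered this way, the macro is available to every template rendered by tasks of that DAG.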

Variables

The Airflow engine passes a few variables by default that are accessible in all templates.

{{ data_interval_start }} (pendulum.DateTime): Start of the data interval. Added in version 2.2.

{{ data_interval_end }} (pendulum.DateTime): End of the data interval. Added in version 2.2.

{{ logical_date }} (pendulum.DateTime): A date-time that logically identifies the current DAG run. This value carries no semantics of its own; it is simply a value for identification. Use data_interval_start and data_interval_end instead if you want a value with real-world semantics, such as getting a slice of rows from the database based on timestamps.

{{ ds }} (str): The DAG run's logical date as YYYY-MM-DD. Same as {{ logical_date | ds }}.

{{ ds_nodash }} (str): Same as {{ logical_date | ds_nodash }}.

{{ exception }} (None | str | Exception | KeyboardInterrupt): Error that occurred while running the task instance.

{{ ts }} (str): Same as {{ logical_date | ts }}. Example: 2018-01-01T00:00:00+00:00.

{{ ts_nodash_with_tz }} (str): Same as {{ logical_date | ts_nodash_with_tz }}. Example: 20180101T000000+0000.

{{ ts_nodash }} (str): Same as {{ logical_date | ts_nodash }}. Example: 20180101T000000.

{{ prev_data_interval_start_success }} (pendulum.DateTime | None): Start of the data interval of the prior successful DagRun. Added in version 2.2.

{{ prev_data_interval_end_success }} (pendulum.DateTime | None): End of the data interval of the prior successful DagRun. Added in version 2.2.

{{ prev_start_date_success }} (pendulum.DateTime | None): Start date of the prior successful DagRun (if available).

{{ prev_end_date_success }} (pendulum.DateTime | None): End date of the prior successful DagRun (if available).

{{ inlets }} (list): List of inlets declared on the task.

{{ inlet_events }} (dict[str, …]): Access past events of inlet assets. See Assets. Added in version 2.10.

{{ outlets }} (list): List of outlets declared on the task.

{{ outlet_events }} (dict[str, …]): Accessors to attach information to asset events that will be emitted by the current task. See Assets. Added in version 2.10.

{{ dag }} (DAG): The currently running DAG. You can read more about DAGs in DAGs.

{{ task }} (BaseOperator): The currently running BaseOperator. You can read more about Tasks in Operators.

{{ macros }}: A reference to the macros package. See Macros below.

{{ task_instance }} (TaskInstance): The currently running TaskInstance.

{{ ti }} (TaskInstance): Same as {{ task_instance }}.

{{ params }} (dict[str, Any]): The user-defined params. This can be overridden by the mapping passed to trigger_dag -c if dag_run_conf_overrides_params is enabled in airflow.cfg.

{{ var.value }}: Airflow variables. See Airflow Variables in Templates below.

{{ var.json }}: Airflow variables. See Airflow Variables in Templates below.

{{ conn }}: Airflow connections. See Airflow Connections in Templates below.

{{ task_instance_key_str }} (str): A unique, human-readable key for the task instance. The format is {dag_id}__{task_id}__{ds_nodash}.

{{ conf }} (AirflowConfigParser): The full configuration object representing the content of your airflow.cfg. See airflow.configuration.conf.

{{ run_id }} (str): The run ID of the currently running DagRun.

{{ dag_run }} (DagRun): The currently running DagRun.

{{ test_mode }} (bool): Whether the task instance was run by the airflow test CLI.

{{ map_index_template }} (None | str): Template used to render the expanded task instance of a mapped task. Setting this value is reflected in the rendered result.

{{ expanded_ti_count }} (int | None): Number of task instances that a mapped task was expanded into. If the current task is not mapped, this is None. Added in version 2.5.

{{ triggering_asset_events }} (dict[str, list[AssetEvent]]): In an asset-scheduled DAG, a map of asset URI to a list of triggering AssetEvents (there may be more than one, if there are multiple assets with different frequencies). Read more in Assets. Added in version 2.4.

Note

The DAG run’s logical date, and values derived from it, such as ds and ts, should not be considered unique in a DAG. Use run_id instead.
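For example, here is a minimal sketch (the task ID is illustrative) of referencing these variables from a templated operator field such as BashOperator's bash_command:

from airflow.operators.bash import BashOperator

report = BashOperator(
    task_id="build_daily_report",
    bash_command=(
        # run_id and the data interval bounds are rendered at task runtime
        "echo 'run_id={{ run_id }}' && "
        "echo 'window: {{ data_interval_start }} to {{ data_interval_end }}'"
    ),
)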

Accessing Airflow context variables from TaskFlow tasks

While @task-decorated tasks don't support rendering Jinja templates passed as arguments, all of the variables listed above can be accessed directly inside tasks. The following code block is an example of accessing the task_instance and dag_run objects from within a task:

from airflow.decorators import task
from airflow.models.dagrun import DagRun
from airflow.models.taskinstance import TaskInstance


@task
def print_ti_info(task_instance: TaskInstance | None = None, dag_run: DagRun | None = None):
    print(f"Run ID: {task_instance.run_id}")  # Run ID: scheduled__2023-08-09T00:00:00+00:00
    print(f"Duration: {task_instance.duration}")  # Duration: 0.972019
    print(f"DAG Run queued at: {dag_run.queued_at}")  # 2023-08-10 00:00:01+02:20

Note that you can access the object’s attributes and methods with simple dot notation. Here are some examples of what is possible: {{ task.owner }}, {{ task.task_id }}, {{ ti.hostname }}, … Refer to the models documentation for more information on the objects’ attributes and methods.
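As a small illustrative sketch (the task ID is made up), the same dot notation works inside any templated field:

from airflow.operators.bash import BashOperator

whoami = BashOperator(
    task_id="echo_owner",
    # task and ti attributes are resolved when the template is rendered
    bash_command="echo 'task {{ task.task_id }} owned by {{ task.owner }} on {{ ti.hostname }}'",
)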

Airflow Variables in Templates

The var template variable allows you to access Airflow Variables. You can access them as either plain-text or JSON. If you use JSON, you are also able to walk nested structures, such as dictionaries like: {{ var.json.my_dict_var.key1 }}.

It is also possible to fetch a variable by string if needed (for example your variable key contains dots) with {{ var.value.get('my.var', 'fallback') }} or {{ var.json.get('my.dict.var', {'key1': 'val1'}) }}. Defaults can be supplied in case the variable does not exist.
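A minimal sketch of both access styles in a templated field; the Variables my_text_var and my_dict_var are assumed to already exist in your deployment:

from airflow.operators.bash import BashOperator

use_vars = BashOperator(
    task_id="use_airflow_variables",
    bash_command=(
        # plain-text access via var.value, nested JSON access via var.json
        "echo plain: {{ var.value.my_text_var }} && "
        "echo nested: {{ var.json.my_dict_var.key1 }}"
    ),
)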

Airflow Connections in Templates

Similarly, Airflow Connections data can be accessed via the conn template variable. For example, you could use expressions in your templates like {{ conn.my_conn_id.login }}, {{ conn.my_conn_id.password }}, etc.

Just like with var, it's possible to fetch a connection by string (e.g. {{ conn.get('my_conn_id_' + index).host }}) or provide defaults (e.g. {{ conn.get('my_conn_id', {"host": "host1", "login": "user1"}).host }}).

Additionally, the extras field of a connection can be fetched as a Python dictionary with the extra_dejson field, e.g. conn.my_aws_conn_id.extra_dejson.region_name would fetch region_name out of extras. Defaults can also be provided for missing extras keys (e.g. {{ conn.my_aws_conn_id.extra_dejson.get('region_name', 'Europe (Frankfurt)') }}).
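A minimal sketch of pulling connection fields into a templated command; the connection ID my_db_conn and the region default are assumptions made for illustration:

from airflow.operators.bash import BashOperator

show_conn = BashOperator(
    task_id="show_connection_info",
    bash_command=(
        # connection attributes and the parsed extras dict are both templatable
        "echo host={{ conn.my_db_conn.host }} "
        "login={{ conn.my_db_conn.login }} "
        "region={{ conn.my_db_conn.extra_dejson.get('region_name', 'eu-central-1') }}"
    ),
)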

Filters

Airflow defines some Jinja filters that can be used to format values.

For example, using {{ logical_date | ds }} will output the logical_date in the YYYY-MM-DD format.

ds (datetime): Format the datetime as YYYY-MM-DD.

ds_nodash (datetime): Format the datetime as YYYYMMDD.

ts (datetime): Same as .isoformat(). Example: 2018-01-01T00:00:00+00:00.

ts_nodash (datetime): Same as the ts filter without -, : or timezone info. Example: 20180101T000000.

ts_nodash_with_tz (datetime): Same as the ts filter without - or :. Example: 20180101T000000+0000.
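For instance, a small sketch (the task ID is illustrative) applying filters with the pipe syntax inside a templated field:

from airflow.operators.bash import BashOperator

partition = BashOperator(
    task_id="echo_partition",
    bash_command=(
        # apply the ds and ts_nodash filters to the run's logical date
        "echo date_partition={{ logical_date | ds }} "
        "file_suffix={{ logical_date | ts_nodash }}"
    ),
)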

Macros

Macros are a way to expose objects to your templates and live under the macros namespace in your templates.

A few commonly used libraries and methods are made available.

macros.datetime: The standard library's datetime.datetime.

macros.timedelta: The standard library's datetime.timedelta.

macros.dateutil: A reference to the dateutil package.

macros.time: The standard library's time.

macros.uuid: The standard library's uuid.

macros.random: The standard library's random.random.
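A minimal sketch (the task ID is illustrative) of calling these standard-library objects from a template:

from airflow.operators.bash import BashOperator

use_macros = BashOperator(
    task_id="use_macros",
    bash_command=(
        # date arithmetic with macros.timedelta, plus a UUID from macros.uuid
        "echo yesterday={{ (logical_date - macros.timedelta(days=1)) | ds }} "
        "run_token={{ macros.uuid.uuid4() }}"
    ),
)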

Some Airflow-specific macros are also defined:

airflow.macros.datetime_diff_for_humans(dt, since=None)[source]

Return a human-readable/approximate difference between datetimes.

When only one datetime is provided, the comparison will be based on now.

Parameters
  • dt (Any) – The datetime to display the diff for

  • since (DateTime | None) – When to display the date from. If None then the diff is between dt and now.
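A hedged sketch of using this macro from a template; the task ID is illustrative, and the exact rendered wording comes from pendulum's humanized diff (something like "2 days ago"):

from airflow.operators.bash import BashOperator

age = BashOperator(
    task_id="echo_run_age",
    # compares data_interval_start to now, since no second datetime is given
    bash_command="echo 'data interval started {{ macros.datetime_diff_for_humans(data_interval_start) }}'",
)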

airflow.macros.ds_add(ds, days)[source]

Add or subtract days from a YYYY-MM-DD.

Parameters
  • ds (str) – anchor date in YYYY-MM-DD format to add to

  • days (int) – number of days to add to the ds, you can use negative values

>>> ds_add("2015-01-01", 5)
'2015-01-06'
>>> ds_add("2015-01-06", -5)
'2015-01-01'
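In a template this is typically reached through the macros namespace; a small sketch (the task ID is illustrative):

from airflow.operators.bash import BashOperator

lookback = BashOperator(
    task_id="echo_lookback_date",
    # shift the run's ds value back five days at render time
    bash_command="echo 'loading data since {{ macros.ds_add(ds, -5) }}'",
)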
airflow.macros.ds_format(ds, input_format, output_format)[source]

Output datetime string in a given format.

Parameters
  • ds (str) – Input string which contains a date.

  • input_format (str) – Input string format (e.g., ‘%Y-%m-%d’).

  • output_format (str) – Output string format (e.g., ‘%Y-%m-%d’).

>>> ds_format("2015-01-01", "%Y-%m-%d", "%m-%d-%y")
'01-01-15'
>>> ds_format("1/5/2015", "%m/%d/%Y", "%Y-%m-%d")
'2015-01-05'
>>> ds_format("12/07/2024", "%d/%m/%Y", "%A %d %B %Y", "en_US")
'Friday 12 July 2024'
airflow.macros.ds_format_locale(ds, input_format, output_format, locale=None)[source]

Output localized datetime string in a given Babel format.

Parameters
  • ds (str) – Input string which contains a date.

  • input_format (str) – Input string format (e.g., ‘%Y-%m-%d’).

  • output_format (str) – Output string Babel format (e.g., yyyy-MM-dd).

  • locale (Locale | str | None) – Locale used to format the output string (e.g., ‘en_US’). If a locale is not specified, the default LC_TIME will be used; if that is also not available, ‘en_US’ will be used.

>>> ds_format("2015-01-01", "%Y-%m-%d", "MM-dd-yy")
'01-01-15'
>>> ds_format("1/5/2015", "%m/%d/%Y", "yyyy-MM-dd")
'2015-01-05'
>>> ds_format("12/07/2024", "%d/%m/%Y", "EEEE dd MMMM yyyy", "en_US")
'Friday 12 July 2024'

New in version 2.10.0.

airflow.macros.random()[source]

Return a random x in the interval [0, 1).
