airflow.timetables.assets

Classes

AssetOrTimeSchedule

Combine time-based scheduling with event-based scheduling.

Module Contents

class airflow.timetables.assets.AssetOrTimeSchedule(*, timetable, assets)[source]

Bases: airflow.timetables.simple.AssetTriggeredTimetable

Combine time-based scheduling with event-based scheduling.
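A minimal usage sketch (the DAG id, asset URI, and cron expression are illustrative, and the DAG/Asset import path assumes Airflow 3's airflow.sdk package):

from airflow.sdk import DAG, Asset
from airflow.timetables.assets import AssetOrTimeSchedule
from airflow.timetables.trigger import CronTriggerTimetable

# Run daily at midnight UTC, or as soon as the asset receives an update,
# whichever comes first.
with DAG(
    dag_id="asset_or_time_example",  # illustrative name
    schedule=AssetOrTimeSchedule(
        timetable=CronTriggerTimetable("0 0 * * *", timezone="UTC"),
        assets=Asset("s3://bucket/example.csv"),  # illustrative asset
    ),
    catchup=False,
):
    ...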

timetable[source]
description[source]

Human-readable description of the timetable.

For example, this can produce something like 'At 21:30, only on Friday' from the cron expression '30 21 * * 5'. This is used in the webserver UI. For this class, it is 'Triggered by assets or ' followed by the wrapped timetable's description.

periodic[source]

Whether this timetable runs periodically.

This defaults to and should generally be True, but some special setups like schedule=None and "@once" set it to False.

can_be_scheduled[source]

Whether this timetable can actually schedule runs in an automated manner.

This defaults to and should generally be True (including non-periodic execution types like @once and data-triggered timetables), but NullTimetable sets this to False.

active_runs_limit[source]

Maximum number of runs that can be active at one time for a DAG.

This is called during DAG initialization, and the return value is used as the DAG’s default max_active_runs. This should generally return None, but there are good reasons to limit DAG run parallelism in some cases, such as for ContinuousTimetable.
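As a sketch of how a custom timetable could use this hook (assuming active_runs_limit is overridden as a property, mirroring what ContinuousTimetable does):

from __future__ import annotations

from airflow.timetables.base import Timetable


class OneAtATimeTimetable(Timetable):  # hypothetical custom timetable
    @property
    def active_runs_limit(self) -> int | None:
        # Becomes the DAG's default max_active_runs: never overlap runs.
        return 1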

classmethod deserialize(data)[source]

Deserialize a timetable from data.

This is called when a serialized DAG is deserialized. data will be whatever was returned by serialize during DAG serialization. The default implementation constructs the timetable without any arguments.

serialize()[source]

Serialize the timetable for JSON encoding.

This is called during DAG serialization to store timetable information in the database. This should return a JSON-serializable dict that will be fed into deserialize when the DAG is deserialized. The default implementation returns an empty dict.
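For a parametrized custom timetable, serialize and deserialize usually just round-trip the constructor arguments. A minimal sketch (the EveryNHoursTimetable class and its hours argument are hypothetical):

from __future__ import annotations

from typing import Any

from airflow.timetables.base import Timetable


class EveryNHoursTimetable(Timetable):  # hypothetical parametrized timetable
    def __init__(self, hours: int = 24) -> None:
        self.hours = hours

    def serialize(self) -> dict[str, Any]:
        # Stored with the serialized DAG; must be JSON-encodable.
        return {"hours": self.hours}

    @classmethod
    def deserialize(cls, data: dict[str, Any]) -> EveryNHoursTimetable:
        # Receives exactly what serialize() returned during DAG serialization.
        return cls(hours=data["hours"])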

validate()[source]

Validate the timetable is correctly specified.

Override this method to provide run-time validation that is performed when a DAG is put into a dagbag. The default implementation does nothing.

Raises:

AirflowTimetableInvalid on validation failure.
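A sketch of a matching validate override for the hypothetical EveryNHoursTimetable above (the constructor is repeated so the snippet stands on its own):

from airflow.exceptions import AirflowTimetableInvalid
from airflow.timetables.base import Timetable


class EveryNHoursTimetable(Timetable):  # same hypothetical timetable as above
    def __init__(self, hours: int = 24) -> None:
        self.hours = hours

    def validate(self) -> None:
        # Runs when the DAG is loaded into a dagbag; raising here surfaces the
        # problem as a DAG import error instead of a scheduler-time failure.
        if self.hours <= 0:
            raise AirflowTimetableInvalid("hours must be a positive integer")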

property summary: str[source]

A short summary for the timetable.

This is used to display the timetable in the web UI. A cron expression timetable, for example, can use this to display the expression. The default implementation returns the timetable’s type name.

infer_manual_data_interval(*, run_after)[source]

When a DAG run is manually triggered, infer a data interval for it.

This is used for e.g. manually-triggered runs, where run_after would be when the user triggers the run. The default implementation raises NotImplementedError.
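A common pattern, sketched here for a hypothetical timetable, is to treat a manual trigger as a zero-length interval at the trigger time via DataInterval.exact:

from airflow.timetables.base import DataInterval, Timetable
from pendulum import DateTime


class TriggerTimeTimetable(Timetable):  # hypothetical minimal example
    def infer_manual_data_interval(self, *, run_after: DateTime) -> DataInterval:
        # A manual run covers exactly the moment it was triggered.
        return DataInterval.exact(run_after)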

next_dagrun_info(*, last_automated_data_interval, restriction)[source]

Provide information to schedule the next DagRun.

The default implementation raises NotImplementedError.

Parameters:

last_automated_data_interval – The data interval of the associated DAG's last scheduled or backfilled run (manual runs not considered).

restriction – Restriction to apply when scheduling the DAG run.

Returns:

Information on when the next DagRun can be scheduled. None means a DagRun will not happen. This does not mean no more runs will ever be scheduled for this DAG; the timetable can return a DagRunInfo object when asked at another time.

Return type:

airflow.timetables.base.DagRunInfo | None
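As an illustrative sketch (not this class's implementation), a hypothetical fixed-interval timetable could implement the method along these lines; catchup handling is omitted for brevity:

from __future__ import annotations

from datetime import timedelta

from airflow.timetables.base import DagRunInfo, DataInterval, TimeRestriction, Timetable


class EveryHourTimetable(Timetable):  # hypothetical fixed-interval timetable
    def next_dagrun_info(
        self,
        *,
        last_automated_data_interval: DataInterval | None,
        restriction: TimeRestriction,
    ) -> DagRunInfo | None:
        if last_automated_data_interval is not None:
            # Continue from where the previous automated run left off.
            next_start = last_automated_data_interval.end
        elif restriction.earliest is not None:
            # First run: start at the DAG's start_date.
            next_start = restriction.earliest
        else:
            return None  # no previous run and no start_date: nothing to schedule
        if restriction.latest is not None and next_start > restriction.latest:
            return None  # past the DAG's end_date
        return DagRunInfo.interval(start=next_start, end=next_start + timedelta(hours=1))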

generate_run_id(*, run_type, **kwargs)[source]
