Running Airflow with systemd
Airflow can integrate with systemd-based systems. This makes watching your daemons easy, as systemd can take care of restarting a daemon on failure.

In the scripts/systemd directory, you can find unit files that have been tested on Red Hat-based systems. These files can be used as-is by copying them over to /usr/lib/systemd/system.
You can find the latest systemd unit files on GitHub: https://github.com/apache/airflow/tree/main/scripts/systemd
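For example, assuming the repository is checked out in the current directory and you want systemd to manage the scheduler, a typical installation might look like this (the scheduler is just one of the supplied units):

sudo cp scripts/systemd/airflow-scheduler.service /usr/lib/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now airflow-scheduler.service

systemctl enable --now registers the unit to start on boot and starts it immediately.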
Assumptions
The following assumptions have been made while creating these unit files:
- Airflow runs as the following user:group: airflow:airflow.
- Airflow runs on a Red Hat-based system.

If this is not the case, appropriate changes will need to be made to the unit files; the sketch below shows the keys you would typically adjust.
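As a rough sketch, these are the [Service] directives in each unit file that encode the assumptions above (the values shown are the assumed defaults, not the only valid ones):

[Service]
EnvironmentFile=/etc/sysconfig/airflow
User=airflow
Group=airflow

Change User and Group if Airflow runs under a different account, and EnvironmentFile if your distribution keeps service environment files elsewhere (e.g. /etc/default on Debian-based systems).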
Environment Configuration
Please note that environment configuration is picked up from /etc/sysconfig/airflow. An example file is supplied within scripts/systemd. You can also set AIRFLOW_HOME or AIRFLOW_CONFIG there to point the services at your Airflow home directory or configuration file.
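A minimal /etc/sysconfig/airflow could look like the following; the paths are illustrative assumptions and must match your installation:

# Hypothetical example; adjust the paths to your installation.
AIRFLOW_HOME=/home/airflow/airflow
AIRFLOW_CONFIG=/home/airflow/airflow/airflow.cfg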
Using Virtual Environments
Note
If Airflow is installed inside a virtual environment (e.g. venv or conda), you must update the ExecStart line in each systemd unit file to activate the virtual environment first.
Example:
ExecStart=/bin/bash -c 'source /home/airflow/airflow_venv/bin/activate && airflow scheduler'
Replace /home/airflow/airflow_venv/ with the path to your virtual environment.
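Alternatively, since the console scripts in a venv's bin directory are bound to that environment's interpreter, pointing ExecStart directly at the environment's airflow executable usually works without an explicit activation step (same assumed path as above):

ExecStart=/home/airflow/airflow_venv/bin/airflow scheduler

This avoids the extra bash wrapper, at the cost of not picking up any environment changes that the activation script would apply.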
New Airflow 3.0 Services
Since Apache Airflow 3.0, additional components have been split out into separate services. The following new unit files are available:
- airflow-triggerer.service for deferrable task triggering
- airflow-api.service for the standalone REST API server
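These are installed and enabled in the same way as the core units; for example:

sudo systemctl enable --now airflow-triggerer.service
sudo systemctl enable --now airflow-api.service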