airflow.providers.ssh.operators.ssh_remote_job¶
SSH Remote Job Operator for deferrable remote command execution.
Classes¶
SSHRemoteJobOperator: Execute a command on a remote host via SSH with deferrable monitoring.
Module Contents¶
- class airflow.providers.ssh.operators.ssh_remote_job.SSHRemoteJobOperator(*, ssh_conn_id, command, remote_host=None, environment=None, remote_base_dir=None, poll_interval=5, log_chunk_size=65536, timeout=None, cleanup='never', remote_os='auto', skip_on_exit_code=None, conn_timeout=None, banner_timeout=30.0, **kwargs)[source]¶
Bases: airflow.providers.common.compat.sdk.BaseOperator
Execute a command on a remote host via SSH with deferrable monitoring.
This operator submits a job to run detached on the remote host, then uses a trigger to asynchronously monitor the job status and stream logs. This approach is resilient to network interruptions because the remote job continues running independently of the SSH connection.
The remote job is wrapped to:
- Run detached from the SSH session (via nohup on POSIX, Start-Process on Windows)
- Redirect stdout/stderr to a log file
- Write the exit code to a file on completion
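On a POSIX remote, the wrapping pattern described above can be sketched roughly as below. This is an illustration of the technique, not the operator's exact generated script; the directory layout and file names (job.log, exit_code, pid) are assumptions.

```shell
# Hypothetical sketch of the POSIX wrapping pattern: the command is detached
# with nohup, its output is redirected to a log file, and its exit code is
# written to a file so a poller can detect completion without a live session.
JOB_DIR=$(mktemp -d)   # stand-in for <remote_base_dir>/<job_id>

nohup sh -c "{ echo 'hello from the remote job'; } \
  > '$JOB_DIR/job.log' 2>&1; echo \$? > '$JOB_DIR/exit_code'" \
  > /dev/null 2>&1 &
echo $! > "$JOB_DIR/pid"

wait   # only for this demo; a real submission returns immediately
```

Because the exit code lands in a file rather than on the SSH channel, a later poll (even over a fresh connection) can read `exit_code` and tail `job.log` to recover the job's outcome.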
- Parameters:
ssh_conn_id (str) – SSH connection ID from Airflow Connections
command (str) – Command to execute on the remote host (templated)
remote_host (str | None) – Override the host from the connection (templated)
environment (dict[str, str] | None) – Environment variables to set for the command (templated)
remote_base_dir (str | None) – Base directory for job artifacts (templated). Defaults to /tmp/airflow-ssh-jobs on POSIX, C:\Windows\Temp\airflow-ssh-jobs on Windows
poll_interval (int) – Seconds between status polls (default: 5)
log_chunk_size (int) – Max bytes to read per poll (default: 65536)
timeout (int | None) – Hard timeout in seconds for the entire operation
cleanup (Literal['never', 'on_success', 'always']) – When to clean up remote job directory: ‘never’, ‘on_success’, or ‘always’ (default: ‘never’)
remote_os (Literal['auto', 'posix', 'windows']) – Remote operating system: ‘auto’, ‘posix’, or ‘windows’ (default: ‘auto’)
skip_on_exit_code (int | collections.abc.Container[int] | None) – Exit codes that should skip the task instead of failing
conn_timeout (int | None) – SSH connection timeout in seconds
banner_timeout (float) – Timeout waiting for SSH banner in seconds
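Since skip_on_exit_code accepts a single int, a container of ints, or None, the resulting outcome logic can be sketched as below. The function name and its internals are illustrative, not the operator's actual implementation; in a real operator the "skipped" and "failed" branches would raise AirflowSkipException and AirflowException respectively.

```python
from __future__ import annotations

from collections.abc import Container


def classify_exit(exit_code: int, skip_on_exit_code: int | Container[int] | None) -> str:
    """Map a remote exit code to a task outcome: success, skipped, or failed."""
    # Normalize the single-int form to a container for a uniform membership test.
    skip_codes: Container[int] = (
        [skip_on_exit_code] if isinstance(skip_on_exit_code, int) else (skip_on_exit_code or [])
    )
    if exit_code == 0:
        return "success"
    if exit_code in skip_codes:
        return "skipped"  # a real operator raises AirflowSkipException here
    return "failed"       # a real operator raises AirflowException here
```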
- template_fields: collections.abc.Sequence[str] = ('command', 'environment', 'remote_host', 'remote_base_dir')[source]¶
- template_ext: collections.abc.Sequence[str] = ('.sh', '.bash', '.ps1')[source]¶
- property ssh_hook: airflow.providers.ssh.hooks.ssh.SSHHook[source]¶
Create the SSH hook for command submission.
- execute(context)[source]¶
Submit the remote job and defer to the trigger for monitoring.
- Parameters:
context (airflow.providers.common.compat.sdk.Context) – Airflow task context
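Putting the parameters together, a task using this operator might look like the following sketch. The connection ID, command, and parameter values are illustrative placeholders, not recommendations.

```python
# Hypothetical usage sketch; assumes an Airflow connection "ssh_default"
# pointing at the remote host, defined inside a DAG context.
from airflow.providers.ssh.operators.ssh_remote_job import SSHRemoteJobOperator

run_etl = SSHRemoteJobOperator(
    task_id="run_etl",
    ssh_conn_id="ssh_default",
    command="bash /opt/etl/run.sh",
    environment={"ETL_ENV": "prod"},
    poll_interval=10,          # poll remote status every 10 seconds
    timeout=3600,              # hard limit of one hour for the whole operation
    cleanup="on_success",      # keep remote artifacts on failure, for debugging
    skip_on_exit_code=99,      # exit code 99 skips the task instead of failing it
)
```

Because the trigger does the monitoring, the task occupies a worker slot only during submission; a network drop between polls does not kill the remote command.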