airflow.providers.teradata.hooks.tpt

Classes

TptHook

Hook for executing Teradata Parallel Transporter (TPT) operations.

Module Contents

class airflow.providers.teradata.hooks.tpt.TptHook(ssh_conn_id=None, *args, **kwargs)[source]

Bases: airflow.providers.teradata.hooks.ttu.TtuHook

Hook for executing Teradata Parallel Transporter (TPT) operations.

This hook provides methods to execute TPT operations both locally and remotely via SSH. It supports DDL operations using the tbuild utility. It extends the TtuHook and integrates with Airflow’s SSHHook for remote execution.

The TPT operations are used to interact with Teradata databases for DDL operations such as creating, altering, or dropping tables.

Features:

- Supports both local and remote execution of TPT operations.
- Secure file encryption for remote transfers.
- Comprehensive error handling and logging.
- Resource cleanup and management.

Parameters:

ssh_conn_id (str | None) – SSH connection ID for remote execution. If None, executes locally.

ssh_conn_id = None[source]
ssh_hook = None[source]
execute_ddl(tpt_script, remote_working_dir)[source]

Execute a DDL statement using TPT.

Args:

tpt_script: TPT script content as a string or list of strings

remote_working_dir: Remote working directory for SSH execution

Returns:

Exit code from the TPT operation

Raises:

ValueError: If tpt_script is empty or invalid

RuntimeError: If tbuild exits with a non-zero status or execution fails unexpectedly

ConnectionError: If the SSH connection is not established or fails

TimeoutError: If the SSH connection or network times out

FileNotFoundError: If the tbuild binary is not found in PATH
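As a hedged sketch of how execute_ddl might be driven, the snippet below assembles a minimal tbuild DDL job as a plain string. The job name, operator name, connection attributes, and target table are illustrative assumptions, not part of the provider's API; only the execute_ddl signature comes from this page.

```python
def build_ddl_script(tdpid: str, username: str, sql: str) -> str:
    """Assemble a minimal TPT job that runs one DDL statement.

    All identifiers here (ddl_job, ddl_op) are hypothetical examples;
    the @password job variable would be supplied at tbuild invocation time.
    """
    return f"""
DEFINE JOB ddl_job
DESCRIPTION 'Run a DDL statement via the DDL operator'
(
  DEFINE OPERATOR ddl_op
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR TdpId = '{tdpid}',
    VARCHAR UserName = '{username}',
    VARCHAR UserPassword = @password
  );

  APPLY ('{sql}') TO OPERATOR (ddl_op);
);
"""

# Example: a script that drops a hypothetical staging table.
script = build_ddl_script("mydbc", "dbadmin", "DROP TABLE staging_orders;")
```

The resulting script could then be passed to the hook, for example `TptHook(ssh_conn_id='teradata_ssh').execute_ddl(script, remote_working_dir='/tmp')` for remote execution, or with `ssh_conn_id=None` to run tbuild on the local machine (connection ID and directory are illustrative).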

on_kill()[source]

Handle cleanup when the task is killed.

This method is called when Airflow needs to terminate the hook, typically during task cancellation or shutdown.

preferred_temp_directory(prefix='tpt_')[source]
get_airflow_home_dir()[source]

Return the Airflow home directory.
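The method body is not shown on this page, so the following is only a plausible sketch of the resolution, assuming it mirrors Airflow's conventional handling of the AIRFLOW_HOME environment variable:

```python
import os


def get_airflow_home_dir() -> str:
    # Assumption: read AIRFLOW_HOME from the environment and fall back
    # to the conventional default of ~/airflow when it is unset.
    return os.environ.get("AIRFLOW_HOME", os.path.expanduser("~/airflow"))
```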
