airflow.providers.common.ai.hooks.pydantic_ai¶
Attributes¶
Classes¶
PydanticAIHook: Hook for LLM access via pydantic-ai.
Module Contents¶
- class airflow.providers.common.ai.hooks.pydantic_ai.PydanticAIHook(llm_conn_id=default_conn_name, model_id=None, **kwargs)[source]¶
Bases: airflow.providers.common.compat.sdk.BaseHook
Hook for LLM access via pydantic-ai.
Manages connection credentials and model creation. Uses pydantic-ai’s model inference to support any provider (OpenAI, Anthropic, Google, Bedrock, Ollama, vLLM, etc.).
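As a plain-Python illustration of the provider:model naming convention described above (the helper below is a hypothetical sketch for demonstration, not pydantic-ai's actual inference logic):

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a "provider:model" string into its two parts.

    Illustrative only: mirrors the documented naming convention,
    not pydantic-ai's internal model inference.
    """
    provider, sep, model_name = model.partition(":")
    if not sep:
        raise ValueError(f"Expected 'provider:model', got {model!r}")
    return provider, model_name

# The Anthropic model string from the connection-field example:
print(split_model_string("anthropic:claude-sonnet-4-20250514"))
# → ('anthropic', 'claude-sonnet-4-20250514')
```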
- Connection fields:
Model (conn-field): Model in provider:model format (e.g. "anthropic:claude-sonnet-4-20250514")
password: API key (OpenAI, Anthropic, Groq, Mistral, etc.)
host: Base URL (optional, for custom endpoints like Ollama, vLLM, Azure)
Cloud providers (Bedrock, Vertex) that use native auth chains should leave password empty and configure environment-based auth (AWS_PROFILE, GOOGLE_APPLICATION_CREDENTIALS).
- Parameters:
llm_conn_id – The Airflow connection ID to use (default: default_conn_name).
model_id – Optional model identifier in provider:model format; takes priority over the model configured on the connection.
- static get_ui_field_behaviour()[source]¶
Return custom field behaviour for the Airflow connection form.
- get_conn()[source]¶
Return a configured pydantic-ai Model.
Reads API key from connection password, base_url from connection host, and model from (in priority order):
- the model_id parameter on the hook
- extra["model"] on the connection (set by the “Model” conn-field in the UI)
The result is cached for the lifetime of this hook instance.
- create_agent(output_type: type[OutputT], *, instructions: str, **agent_kwargs) pydantic_ai.Agent[None, OutputT][source]¶
- create_agent(*, instructions: str, **agent_kwargs) pydantic_ai.Agent[None, str]
Create a pydantic-ai Agent configured with this hook’s model.
- Parameters:
output_type – The expected output type from the agent (default: str).
instructions – System-level instructions for the agent.
agent_kwargs – Additional keyword arguments passed to the Agent constructor.
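The two overloads above mean that omitting output_type yields a plain-text agent, while passing a type yields structured output. A minimal sketch of that dispatch, using a stub in place of pydantic_ai.Agent (names and behaviour here are illustrative assumptions, not the hook's code):

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class StubAgent:
    """Minimal stand-in for pydantic_ai.Agent, for illustration only."""
    output_type: type
    instructions: str


def create_agent(output_type: type = str, *, instructions: str, **agent_kwargs: Any) -> StubAgent:
    """Sketch of the documented overload behaviour: omit output_type and
    the agent produces str; pass a type for structured output."""
    return StubAgent(output_type=output_type, instructions=instructions)


# Plain-text agent (output_type defaults to str):
summarizer = create_agent(instructions="Summarize the input.")
# Structured agent:
extractor = create_agent(dict, instructions="Extract fields as a mapping.")
```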
- test_connection()[source]¶
Test connection by resolving the model.
Validates that the model string is valid, the provider package is installed, and the provider class can be instantiated. Does NOT make an LLM API call — that would be expensive, flaky, and fail for reasons unrelated to connectivity (quotas, billing, rate limits).