Pydantic AI Connection¶
The Pydantic AI connection type configures access to LLM providers via the pydantic-ai framework. A single connection type works with any provider that pydantic-ai supports: OpenAI, Anthropic, Google, Bedrock, Groq, Mistral, Ollama, vLLM, and others.
Default Connection IDs¶
The `PydanticAIHook` uses `pydanticai_default` by default.
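Like any Airflow connection, `pydanticai_default` can be supplied via an `AIRFLOW_CONN_<CONN_ID>` environment variable instead of the UI. A minimal sketch, assuming Airflow's JSON-serialized connection env-var format (the values are placeholders):

```python
import json
import os

# Airflow reads connections from AIRFLOW_CONN_<CONN_ID> environment variables;
# a JSON-serialized value like this defines pydanticai_default without the UI.
# ("sk-..." is a placeholder API key.)
os.environ["AIRFLOW_CONN_PYDANTICAI_DEFAULT"] = json.dumps(
    {
        "conn_type": "pydanticai",
        "password": "sk-...",
        # "extra" holds a JSON *string*, hence the nested json.dumps
        "extra": json.dumps({"model": "openai:gpt-5.3"}),
    }
)

conn = json.loads(os.environ["AIRFLOW_CONN_PYDANTICAI_DEFAULT"])
print(json.loads(conn["extra"])["model"])  # openai:gpt-5.3
```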
Configuring the Connection¶
- Model

  The model identifier in `provider:model` format. This field appears as a dedicated input in the connection form (via conn-fields) and stores its value in `extra["model"]`.

  Examples: `openai:gpt-5.3`, `anthropic:claude-sonnet-4-20250514`, `bedrock:us.anthropic.claude-opus-4-6-v1:0`, `google:gemini-2.0-flash`

  The model can also be overridden at the hook/operator level via the `model_id` parameter.

- API Key (Password field)

  The API key for your LLM provider. Required for API-key-based providers (OpenAI, Anthropic, Groq, Mistral). Leave empty for providers using environment-based auth (Bedrock via `AWS_PROFILE`, Vertex via `GOOGLE_APPLICATION_CREDENTIALS`).

- Host (optional)

  Base URL for the provider's API. Only needed for custom endpoints:

  - Ollama: `http://localhost:11434/v1`
  - vLLM: `http://localhost:8000/v1`
  - Azure OpenAI: `https://<resource>.openai.azure.com/openai/deployments/<deployment>`
  - Any OpenAI-compatible API: the base URL of that service

- Extra (JSON, optional)

  A JSON object with additional configuration. Programmatic users can set the model directly in extra:

  `{"model": "openai:gpt-5.3"}`

  When using the UI, the “Model” field above writes to this same location automatically.
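The `provider:model` format splits on the first colon only, which matters for Bedrock identifiers that themselves contain colons. A hypothetical helper for illustration (pydantic-ai does its own parsing internally):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a 'provider:model' identifier on the first colon.

    Illustrative sketch only, not the library's actual parser. Bedrock model
    ids contain extra colons, so only the first one separates the provider
    prefix from the model name.
    """
    provider, sep, model = model_id.partition(":")
    if not sep:
        raise ValueError(f"expected 'provider:model', got {model_id!r}")
    return provider, model


print(split_model_id("openai:gpt-5.3"))
# ('openai', 'gpt-5.3')
print(split_model_id("bedrock:us.anthropic.claude-opus-4-6-v1:0"))
# ('bedrock', 'us.anthropic.claude-opus-4-6-v1:0')
```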
Examples¶
OpenAI
```json
{
  "conn_type": "pydanticai",
  "password": "sk-...",
  "extra": "{\"model\": \"openai:gpt-5.3\"}"
}
```
Anthropic
```json
{
  "conn_type": "pydanticai",
  "password": "sk-ant-...",
  "extra": "{\"model\": \"anthropic:claude-opus-4-6\"}"
}
```
Ollama (local)
```json
{
  "conn_type": "pydanticai",
  "host": "http://localhost:11434/v1",
  "extra": "{\"model\": \"openai:llama3\"}"
}
```
AWS Bedrock
Leave the password empty and configure `AWS_PROFILE` or an IAM role in the environment:

```json
{
  "conn_type": "pydanticai",
  "extra": "{\"model\": \"bedrock:us.anthropic.claude-opus-4-6-v1:0\"}"
}
```
Google Vertex AI
Leave the password empty and configure `GOOGLE_APPLICATION_CREDENTIALS` in the environment:

```json
{
  "conn_type": "pydanticai",
  "extra": "{\"model\": \"google:gemini-2.0-flash\"}"
}
```
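Because `extra` is a JSON string embedded inside the connection JSON, the escaping in the examples above is easy to get wrong by hand. Serializing twice with `json.dumps` produces it mechanically:

```python
import json

# The "extra" field holds a JSON *string*, so it is serialized twice:
# once for the extra payload itself, once for the enclosing connection JSON.
extra = json.dumps({"model": "anthropic:claude-opus-4-6"})
conn = json.dumps(
    {
        "conn_type": "pydanticai",
        "password": "sk-ant-...",  # placeholder key
        "extra": extra,
    }
)
print(conn)
```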
Model Resolution Order¶
The hook reads the model from these sources in priority order:
1. `model_id` parameter on the hook/operator
2. `model` in the connection’s extra JSON (set by the “Model” conn-field in the UI)