airflow.providers.common.ai.operators.llm
Operator for general-purpose LLM calls.
Classes
LLMOperator | Call an LLM with a prompt and return the output.
Module Contents
- class airflow.providers.common.ai.operators.llm.LLMOperator(*, prompt, llm_conn_id, model_id=None, system_prompt='', output_type=str, agent_params=None, require_approval=False, approval_timeout=None, allow_modifications=False, **kwargs)
Bases: airflow.providers.common.compat.sdk.BaseOperator, airflow.providers.common.ai.mixins.approval.LLMApprovalMixin

Call an LLM with a prompt and return the output.

Uses a PydanticAIHook for LLM access. Supports plain string output (the default) and structured output via a Pydantic BaseModel. When output_type is a BaseModel subclass, the result is serialized via model_dump() for XCom.

- Parameters:
prompt (str) – The prompt to send to the LLM.
llm_conn_id (str) – Connection ID for the LLM provider.
model_id (str | None) – Model identifier (e.g. "openai:gpt-5"). Overrides the model stored in the connection's extra field.
system_prompt (str) – System-level instructions for the LLM agent.
output_type (type) – Expected output type. Defaults to str. Set to a Pydantic BaseModel subclass for structured output.
agent_params (dict[str, Any] | None) – Additional keyword arguments passed to the pydantic-ai Agent constructor (e.g. retries, model_settings, tools). See the pydantic-ai Agent docs for the full list.
require_approval (bool) – If True, the task defers after generating output and waits for a human reviewer to approve or reject via the HITL interface. Defaults to False.
approval_timeout (datetime.timedelta | None) – Maximum time to wait for a review. When exceeded, the task fails with a TimeoutError.
allow_modifications (bool) – If True, the reviewer can edit the output before approving; the modified value is returned as the task result. Defaults to False.
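A minimal usage sketch based on the parameters above, assuming the provider package is installed and that an Airflow connection with ID "llm_default" exists (the connection ID, task ID, and prompt text are illustrative, not part of the provider):

```python
from datetime import timedelta

from airflow.providers.common.ai.operators.llm import LLMOperator

# Hypothetical DAG-file fragment: generate a summary, then hold it for
# human review (HITL) before downstream tasks consume the XCom value.
summarize = LLMOperator(
    task_id="summarize_notes",
    prompt="Summarize the following release notes: {{ ti.xcom_pull(task_ids='fetch') }}",
    llm_conn_id="llm_default",          # assumed connection ID
    model_id="openai:gpt-5",
    system_prompt="You are a concise release-note summarizer.",
    require_approval=True,              # defer until a reviewer responds
    approval_timeout=timedelta(hours=4),
    allow_modifications=True,           # reviewer edits become the result
)
```

With require_approval=True the task defers after the LLM call, so it occupies no worker slot while waiting for the reviewer's decision.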
- template_fields: collections.abc.Sequence[str] = ('prompt', 'llm_conn_id', 'model_id', 'system_prompt', 'agent_params')
- property llm_hook: airflow.providers.common.ai.hooks.pydantic_ai.PydanticAIHook
Return PydanticAIHook for the configured LLM connection.
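The XCom serialization rule described above (a plain str result passes through unchanged; a BaseModel subclass result is dumped to a dict) can be sketched with stdlib stand-ins; ReviewSummary and serialize_for_xcom are illustrative names, not part of the provider's API:

```python
from dataclasses import asdict, dataclass


# Stand-in for a pydantic BaseModel used as output_type; asdict()
# plays the role pydantic's model_dump() plays in the operator.
@dataclass
class ReviewSummary:
    title: str
    sentiment: str


def serialize_for_xcom(result):
    """Mimic the operator's return-value handling: structured results
    become plain dicts so the XCom value is JSON-serializable."""
    if hasattr(result, "__dataclass_fields__"):
        return asdict(result)  # pydantic equivalent: result.model_dump()
    return result


print(serialize_for_xcom("plain text"))
# → 'plain text'
print(serialize_for_xcom(ReviewSummary(title="v2.1", sentiment="positive")))
# → {'title': 'v2.1', 'sentiment': 'positive'}
```

Downstream tasks therefore receive either a string or a plain dict from XCom, never a live model instance.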