airflow.providers.common.ai.operators.llm_branch

LLM-driven branching operator.

Classes

LLMBranchOperator

Ask an LLM to choose which downstream task(s) to execute.

Module Contents

class airflow.providers.common.ai.operators.llm_branch.LLMBranchOperator(*, allow_multiple_branches=False, **kwargs)[source]

Bases: airflow.providers.common.ai.operators.llm.LLMOperator, airflow.providers.standard.operators.branch.BranchMixIn

Ask an LLM to choose which downstream task(s) to execute.

Downstream task IDs are discovered automatically from the DAG topology and presented to the LLM as a constrained enum via pydantic-ai structured output. No text parsing or manual validation is needed.

Parameters:
  • prompt – The prompt to send to the LLM.

  • llm_conn_id – Connection ID for the LLM provider.

  • model_id – Model identifier (e.g. "openai:gpt-5"). Overrides the model stored in the connection’s extra field.

  • system_prompt – System-level instructions for the LLM agent.

  • allow_multiple_branches (bool) – When False (default) the LLM returns a single task ID. When True the LLM may return one or more task IDs.

  • agent_params – Additional keyword arguments passed to the pydantic-ai Agent constructor (e.g. retries, model_settings, tools).
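A minimal usage sketch of the parameters above. This assumes an Airflow 3 installation with this provider available, a configured LLM connection named `llm_default`, and invented DAG/task IDs; it is illustrative, not a canonical pattern. It cannot run without a live Airflow environment and provider install.

```python
# Hypothetical sketch: route a ticket to one of two downstream tasks.
# "llm_default", the DAG id, and task ids are assumptions for illustration.
import pendulum

from airflow.providers.common.ai.operators.llm_branch import LLMBranchOperator
from airflow.providers.standard.operators.empty import EmptyOperator
from airflow.sdk import DAG

with DAG(
    dag_id="triage_support_tickets",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    schedule=None,
):
    branch = LLMBranchOperator(
        task_id="route_ticket",
        llm_conn_id="llm_default",
        system_prompt="You are a support-ticket triage assistant.",
        prompt="Classify this ticket: {{ dag_run.conf['ticket_text'] }}",
        # Default allow_multiple_branches=False: the LLM picks exactly one
        # downstream task id; the other branch is skipped.
    )

    escalate = EmptyOperator(task_id="escalate")
    auto_reply = EmptyOperator(task_id="auto_reply")

    # The downstream task ids ("escalate", "auto_reply") are discovered from
    # the DAG topology and become the enum the LLM must choose from.
    branch >> [escalate, auto_reply]
```

Because `prompt` is a template field, the Jinja expression above is rendered with the run's context before being sent to the model.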

inherits_from_skipmixin = True[source]

Used to determine whether an operator inherits from SkipMixin or one of its subclasses (e.g. BranchMixIn).

template_fields: collections.abc.Sequence[str] = ('prompt', 'llm_conn_id', 'model_id', 'system_prompt', 'agent_params')[source]
allow_multiple_branches = False[source]
execute(context)[source]

Derive this method when creating an operator; it is the main entry point for task execution. context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
