agents.components.semantic_router#

Module Contents#

Classes#

RouterMode

Enum representing the operational modes of the SemanticRouter.

SemanticRouter

A unified component that routes semantic information from input topics to output topics.

API#

class agents.components.semantic_router.RouterMode(*args, **kwds)#

Bases: enum.Enum

Enum representing the operational modes of the SemanticRouter. Modes:

  • LLM: Agentic mode using LLM for intent analysis and routing

  • VECTOR: Vector mode using embeddings and vector database for routing

name()#

The name of the Enum member.

value()#

The value of the Enum member.
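For illustration, RouterMode members can be inspected like any Python enum member (a minimal sketch; the underlying values depend on the enum definition):

from agents.components.semantic_router import RouterMode

mode = RouterMode.VECTOR
print(mode.name)            # "VECTOR"
print(RouterMode.LLM.name)  # "LLM"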

class agents.components.semantic_router.SemanticRouter(*, inputs: List[agents.ros.Topic], routes: List[agents.ros.Route], config: Optional[Union[agents.config.SemanticRouterConfig, agents.config.LLMConfig]] = None, db_client: Optional[agents.clients.db_base.DBClient] = None, model_client: Optional[agents.clients.model_base.ModelClient] = None, default_route: Optional[agents.ros.Route] = None, component_name: str, **kwargs)#

Bases: agents.components.llm.LLM

A unified component that routes semantic information from input topics to output topics.

This component can operate in two modes:

  1. Vector Mode (Standard): Uses a vector database to route inputs based on embedding similarity to route samples.

  2. LLM Mode (Agentic): Uses an LLM to intelligently analyze intent and route inputs via function calling.

The mode is determined automatically based on the client provided (db_client vs model_client).

Parameters:
  • inputs (list[Topic]) – A list of input text topics that this component will subscribe to.

  • routes (list[Route]) – A list of pre-defined routes that publish incoming input to the routed output topics.

  • default_route (Optional[Route]) – An optional route that specifies the default behavior when no specific route matches. In Vector Mode, it is used when no route sample falls within the distance threshold. In LLM Mode, it is used if the model fails to select a route.

  • config (Union[SemanticRouterConfig, LLMConfig]) – The configuration object. Accepts SemanticRouterConfig (for vector mode parameters) or LLMConfig (if specific LLM settings are needed). Defaults to SemanticRouterConfig.

  • db_client (Optional[DBClient]) – (Vector Mode) A database client used to store and retrieve routing information.

  • model_client (Optional[ModelClient]) – (LLM Mode) A model client used for intelligent intent analysis and tool calling.

  • component_name (str) – The name of this Semantic Router component (default: “router_component”).

  • kwargs – Additional keyword arguments.
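Both examples below elide the topic and route definitions. A minimal sketch of what these might look like, assuming the Topic and Route APIs from agents.ros (topic names and sample phrases are illustrative):

from agents.ros import Topic, Route

# Input topic the router subscribes to
input_text = Topic(name="text_in", msg_type="String")

# Output topics, one per destination
goto_out = Topic(name="goto_in", msg_type="String")
chat_out = Topic(name="chat_in", msg_type="String")

# Routes pair an output topic with sample phrases used for matching
route1 = Route(
    routes_to=goto_out,
    samples=["Go to the door", "Navigate to the kitchen", "Move to the lab"],
)
route2 = Route(
    routes_to=chat_out,
    samples=["What is your name?", "Tell me a joke", "How are you today?"],
)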

Example usage (Vector Mode):

# Imports (module paths assumed from the agents package layout)
from agents.components import SemanticRouter
from agents.config import SemanticRouterConfig
from agents.clients.roboml import HTTPDBClient
from agents.vectordbs import ChromaDB

# ... topics and routes as sketched above ...
config = SemanticRouterConfig(router_name="my_vector_router")
db_client = HTTPDBClient(db=ChromaDB(host="localhost", port=8080))

router = SemanticRouter(
    inputs=[input_text],
    routes=[route1, route2],
    db_client=db_client,
    config=config,
    component_name="router"
)

Example usage (LLM Mode):

# Imports (module path assumed)
from agents.clients.ollama import OllamaClient

# ... topics and routes as sketched above ...
model_client = OllamaClient(model_name="llama3", checkpoint="llama3.1:latest",
                            init_params={"temperature": 0.0})

router = SemanticRouter(
    inputs=[input_text],
    routes=[route1, route2],
    model_client=model_client,
    component_name="smart_router"
)
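Since the model may fail to select a route in LLM Mode, an optional default_route provides a fallback. A minimal sketch, reusing route1 from the sketch above:

router = SemanticRouter(
    inputs=[input_text],
    routes=[route1, route2],
    default_route=route1,
    model_client=model_client,
    component_name="smart_router"
)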

custom_on_configure()#

Create the model client if provided and initialize the model.

custom_on_deactivate()#

Deactivate component.

abstractmethod set_component_prompt(template: Union[str, pathlib.Path]) → None#

LOCKED: The SemanticRouter does not use component prompts.

abstractmethod set_topic_prompt(input_topic: agents.ros.Topic, template: Union[str, pathlib.Path]) → None#

LOCKED: Input topics are routed as-is.

abstractmethod register_tool(tool, tool_description, send_tool_response_to_model=False) → None#

LOCKED: Tools are automatically generated from ‘Route’ objects.

abstractmethod add_documents(ids, metadatas, documents) → None#

LOCKED: Document storage is managed via Route samples.

set_system_prompt(prompt: str) → None#

Set the system prompt for the model, which defines the model’s ‘personality’.

Parameters:

prompt – the prompt string, or a path to a file containing it.

Return type:

None

Example usage:

llm_component = LLM(inputs=[text0],
                    outputs=[text1],
                    model_client=model_client,
                    config=config,
                    component_name='llama_component')
llm_component.set_system_prompt(prompt="You are an amazing and funny robot. You answer all questions with short and concise answers.")
property additional_model_clients: Optional[Dict[str, agents.clients.model_base.ModelClient]]#

Get additional model clients.

change_model_client(model_client_name: str) → bool#

Change the model client.

This method changes the model client that the component is using, at runtime.

It can be invoked as a follow-up action in response to an event. For example, if a client communicating with a cloud model becomes unresponsive, it can be replaced with another client for a locally deployed model.
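A hedged sketch of switching clients at runtime; it assumes additional clients can be registered through the additional_model_clients property (the property setter, the "local_fallback" name, and local_client are illustrative assumptions):

# Register a named fallback client (assumed property setter)
router.additional_model_clients = {"local_fallback": local_client}

# Later, e.g. from an event handler, switch to the fallback by name
if not router.change_model_client("local_fallback"):
    print("Could not switch to the fallback model client")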

property warmup: bool#

Enable warmup of the model.
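For illustration, and assuming the property is settable, warmup can be toggled before the component starts:

# Run a warmup inference when the component is configured (assumed setter)
router.warmup = True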

custom_on_activate()#

Custom activation logic for creating triggers.

create_all_subscribers()#

Override to handle trigger topics and fixed inputs. Called by the parent BaseComponent.

activate_all_triggers() → None#

Activates component triggers by attaching the execution step to callbacks.

destroy_all_subscribers() → None#

Destroys all node subscribers.

trigger(trigger: Union[agents.ros.Topic, List[agents.ros.Topic], float, agents.ros.Event, None]) → None#

Set the component trigger.
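For illustration, a trigger can be a topic, a list of topics, a float, or an event. A minimal sketch (interpreting the float form as a timer period in seconds is an assumption):

# Run the routing step whenever a message arrives on input_text
router.trigger(input_text)

# Or run it on a fixed period instead (assumed semantics of the float form)
router.trigger(10.0)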

validate_topics(topics: Sequence[Union[agents.ros.Topic, agents.ros.FixedInput]], allowed_topic_types: Optional[Dict[str, List[Union[Type[agents.ros.SupportedType], List[Type[agents.ros.SupportedType]]]]]] = None, topics_direction: str = 'Topics')#

Verify component-specific inputs or outputs against the allowed topic types, if provided.