# agents.clients
Clients are standard interfaces for components to interact with ML models or vector DBs served by various platforms. EmbodiedAgents currently provides the following clients, which cover the most popular open-source model deployment platforms. Simple clients can easily be implemented for other platforms, and the use of unnecessarily heavy, duct-tape "AI" frameworks on the robot is discouraged 😅.
Note
Some clients might need additional dependencies, which are noted in the following table. If any are missing, the user will also be prompted for them at runtime.
| Platform | Client | Description |
|---|---|---|
| Generic |  | A generic client for interacting with OpenAI-compatible APIs, including vLLM, ms-swift, lmdeploy, Google Gemini, etc. Supports both standard and streaming responses, and works with LLMs and multimodal LLMs (MLLMs). Designed to be compatible with any API following the OpenAI standard (see the request sketch after the table). |
| RoboML |  | An HTTP client for interacting with ML models served on RoboML. Supports streaming outputs. |
| RoboML |  | A WebSocket-based client for persistent interaction with RoboML-hosted ML models. Particularly useful for low-latency streaming of audio or text data. |
| RoboML |  | A Redis Serialization Protocol (RESP) based client for ML models served via RoboML. Requires an additional dependency. |
| Ollama |  | An HTTP client for interacting with ML models served on Ollama. Supports LLMs/MLLMs and embedding models. It can be invoked with the generic OllamaModel. Requires an additional dependency (see the Ollama example after the table). |
| ChromaDB |  | An HTTP client for interacting with a ChromaDB instance running as a server. Ensure that a ChromaDB server is active (see the example after the table). |
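The generic client targets any server that speaks the OpenAI API. As a point of reference, the sketch below shows what a raw chat completion request to such a server looks like using plain `requests`; it is not the EmbodiedAgents client itself, and the base URL, API key, and model name are assumptions that depend on your deployment (e.g. a local vLLM instance).

```python
# Minimal sketch of an OpenAI-compatible chat completion request.
# BASE_URL, the API key, and the model name are assumptions; point them
# at whichever OpenAI-compatible server (vLLM, lmdeploy, etc.) you run.
import requests

BASE_URL = "http://localhost:8000/v1"        # assumed local vLLM-style server
headers = {"Authorization": "Bearer EMPTY"}  # many local servers ignore the key

payload = {
    "model": "my-model",  # hypothetical model name
    "messages": [{"role": "user", "content": "What do you see ahead of the robot?"}],
    "stream": False,
}

resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, headers=headers, timeout=30)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```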
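Similarly, the Ollama client ultimately talks to Ollama's HTTP API. The sketch below illustrates that underlying API directly with `requests`; the model names are assumptions and must already be pulled on the Ollama server (default port 11434).

```python
# Minimal sketch of raw Ollama API calls (generation and embeddings).
# Model names are assumptions; pull them first, e.g. `ollama pull llama3.2`.
import requests

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

# Text generation
gen = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": "llama3.2", "prompt": "Describe the scene in one sentence.", "stream": False},
    timeout=60,
)
print(gen.json()["response"])

# Embeddings (e.g. for use together with a vector DB client)
emb = requests.post(
    f"{OLLAMA_URL}/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "kitchen table with two chairs"},
    timeout=60,
)
print(len(emb.json()["embedding"]), "dimensional embedding")
```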
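For the ChromaDB client, a server must already be running; assuming the `chromadb` package is installed, a local server can typically be started with `chroma run --path ./chroma_db`. The sketch below is only a connectivity check against such a server using ChromaDB's own Python HTTP client; the host, port, and collection name are assumptions.

```python
# Minimal connectivity check against a running ChromaDB server.
# Host/port are ChromaDB defaults; adjust to your deployment.
import chromadb

client = chromadb.HttpClient(host="localhost", port=8000)
print("server heartbeat (ns):", client.heartbeat())

# Collections created here can then be queried by the vector DB client.
collection = client.get_or_create_collection(name="robot_memories")  # hypothetical name
print("collection count:", collection.count())
```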