agents.vectordbs#
The following vector DB specification classes define a common interface for initializing vector DBs. Currently, the only supported vector DB is Chroma.
Module Contents#
Classes#
API#
- class agents.vectordbs.ChromaDB#
Bases: agents.vectordbs.DB
Chroma is the open-source AI application database. It provides embeddings, vector search, document storage, full-text search, metadata filtering, and multi-modal retrieval support.
- Parameters:
username (Optional[str], optional) – The username for authentication. Defaults to None.
password (Optional[str], optional) – The password for authentication. Defaults to None.
embeddings (str, optional) – Embedding backend to use. Choose from “ollama” or “sentence-transformers”.
checkpoint (str, optional) – The model checkpoint to use for embeddings. For example, “bge-large:latest”.
ollama_host (str, optional) – Host address for the Ollama service (used if embeddings="ollama").
ollama_port (int, optional) – Port number for the Ollama service.
init_timeout (int, optional) – The timeout in seconds for the initialization process. Defaults to 10 minutes (600 seconds).
To use ChromaDB with the supported embedding backends, install the following Python packages:
```bash
pip install ollama                 # if using ollama (requires separate Ollama runtime)
pip install sentence-transformers  # if using sentence-transformers
```
If using the Ollama backend, make sure the Ollama server is running and accessible at the specified host and port.
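Before constructing the config, you can verify that the Ollama server responds. The sketch below is not part of the agents.vectordbs API; it assumes Ollama's standard REST endpoint (`/api/tags`) and uses illustrative host and port values.

```python
# Minimal reachability check for an Ollama server (assumes the standard
# Ollama REST endpoint /api/tags; host/port values are illustrative).
import urllib.request
import urllib.error

def ollama_is_reachable(host: str = "localhost", port: int = 11434, timeout: float = 3.0) -> bool:
    url = f"http://{host}:{port}/api/tags"  # lists locally available models
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if not ollama_is_reachable():
    raise RuntimeError("Ollama server is not reachable at localhost:11434")
```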
Example Usage:
```python
from agents.vectordbs import ChromaDB

chroma_config = ChromaDB(
    embeddings='ollama',
    checkpoint='bge-large:latest',
    ollama_host='localhost',
    ollama_port=11434
)
```
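For the sentence-transformers backend, the same class can be configured without Ollama connection details. The checkpoint name below (`all-MiniLM-L6-v2`) is only an illustrative sentence-transformers model, not a value mandated by this module.

```python
from agents.vectordbs import ChromaDB

# Sketch of a sentence-transformers configuration (no Ollama server needed);
# the checkpoint is an example model name, substitute any sentence-transformers model.
chroma_config = ChromaDB(
    embeddings='sentence-transformers',
    checkpoint='all-MiniLM-L6-v2',
)
```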
- get_init_params() → Dict#
Get the initialization parameters from the model.
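A minimal usage sketch, assuming the returned dictionary simply mirrors the fields set on the config object (the exact keys are determined by the model definition):

```python
# Hypothetical usage: inspect the parameters that will be used to initialize the DB.
params = chroma_config.get_init_params()
print(params)  # e.g. {'embeddings': 'ollama', 'checkpoint': 'bge-large:latest', ...}
```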