agents.clients.ollama
Module Contents
Classes
OllamaClient | An HTTP client for interaction with ML models served on Ollama
API
- class agents.clients.ollama.OllamaClient(model: Union[agents.models.OllamaModel, Dict], host: str = '127.0.0.1', port: int = 11434, inference_timeout: int = 30, init_on_activation: bool = True, logging_level: str = 'info', **kwargs)
Bases: agents.clients.model_base.ModelClient
An HTTP client for interacting with ML models served on Ollama (see the usage sketch after the method listing).
- serialize() → Dict
Get the client configuration as a JSON-serializable dictionary.
- Return type:
Dict
- check_connection() → None
Check the connection to the Ollama server.
- Return type:
None
- initialize() → None
Initialize the client and set up the model on the Ollama server.
- Return type:
None
- inference(inference_input: Dict[str, Any]) → Optional[Dict]
Run inference on the served model and return the output, or None if inference fails.
- Parameters:
inference_input (dict[str, Any])
- Return type:
dict | None
- deinitialize()
Deinitialize the client.
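
The following is a minimal usage sketch of the client lifecycle against a local Ollama server, using the signatures documented above. The OllamaModel constructor argument (`name`) and the keys of the `inference_input` dictionary (here, `"query"`) are illustrative assumptions and are not specified on this page; consult the OllamaModel and model documentation for the actual fields.

```python
from agents.models import OllamaModel
from agents.clients.ollama import OllamaClient

# Describe the model to be served on Ollama.
# NOTE: the `name` argument is an assumption for illustration.
model = OllamaModel(name="llama3.2")

# Create the client against a local Ollama server (defaults shown explicitly).
client = OllamaClient(
    model,
    host="127.0.0.1",
    port=11434,
    inference_timeout=30,
    logging_level="info",
)

# Verify the server is reachable, then set up the model.
client.check_connection()
client.initialize()

# The shape of `inference_input` is an assumption for illustration;
# it is a plain dict of input fields passed to the model.
result = client.inference({"query": "What is the capital of France?"})
if result is not None:
    print(result)

# Release the client when done.
client.deinitialize()
```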