agents.clients.generic#

Module Contents#

Classes#

GenericHTTPClient

A generic client for interacting with OpenAI-compatible APIs, including vLLM, ms-swift, lmdeploy, and Google Gemini. This client works with both LLM and multimodal LLM models and supports both standard and streaming responses. It is designed to be compatible with any API that follows the OpenAI standard.

API#

class agents.clients.generic.GenericHTTPClient(model: Union[agents.models.LLM, Dict], host: str = '127.0.0.1', port: Optional[int] = 8000, inference_timeout: int = 30, api_key: Optional[str] = None, logging_level: str = 'info', **kwargs)#

Bases: agents.clients.model_base.ModelClient

A generic client for interacting with OpenAI-compatible APIs, including vLLM, ms-swift, lmdeploy, and Google Gemini. This client works with both LLM and multimodal LLM models and supports both standard and streaming responses. It is designed to be compatible with any API that follows the OpenAI standard.
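
A minimal construction sketch is shown below. The LLM constructor arguments (name, checkpoint) and the model identifier are illustrative assumptions, not values prescribed by this reference; per the signature above, the model can also be passed as a plain dict.

from agents.clients.generic import GenericHTTPClient
from agents.models import LLM

# Hypothetical model definition; the "name" and "checkpoint" values are
# assumptions for illustration only.
llm = LLM(name="generic_llm", checkpoint="Qwen/Qwen2.5-7B-Instruct")

client = GenericHTTPClient(
    model=llm,
    host="127.0.0.1",      # address of the OpenAI-compatible server
    port=8000,             # port the server listens on
    inference_timeout=30,  # seconds to wait for a response
    api_key=None,          # set this when the endpoint requires authentication
)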

serialize() Dict#

Get the client configuration as a JSON-serializable dict.

Return type:

Dict
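
Hypothetical usage, continuing the construction example above; the exact keys of the returned dict are not specified in this reference.

config = client.serialize()  # dict describing the client (shape is an assumption)
print(config)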

check_connection() None#

Check the connection to the API endpoint.

Return type:

None

initialize() None#

Initialize the client.

Return type:

None
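
A sketch of the expected setup sequence, continuing the example above. The ordering (connection check before initialization) is an assumption, since the docstrings do not state it.

client.check_connection()  # verify the OpenAI-compatible endpoint is reachable
client.initialize()        # prepare the client for inference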

inference(inference_input: Dict[str, Any]) Optional[Dict]#

Run inference on the given input and return the result.

Parameters:

inference_input (dict[str, Any])

Return type:

dict | None
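
A hedged sketch of an inference call, continuing the example above. The key names in inference_input ("query" and the role/content message format) are assumptions modeled on OpenAI-style chat payloads and may differ from the actual schema expected by the client.

# Hypothetical input payload; key names are assumptions, not taken from this reference.
result = client.inference(
    {"query": [{"role": "user", "content": "What is the capital of France?"}]}
)
if result is not None:
    print(result)  # shape of the returned dict depends on the serving endpoint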

deinitialize()#

Deinitialize the client.
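
Calling deinitialize() once the client is no longer needed is a reasonable cleanup step (a sketch; the reference gives no further detail).

client.deinitialize()  # release any resources held by the client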