trinity.common.models.external_model module#

class trinity.common.models.external_model.ExternalModel(config: InferenceModelConfig)[source]#

Bases: InferenceModel

Inference model backed by an external OpenAI-compatible API.
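As a usage sketch, the call pattern looks like the following. `StubExternalModel` is a hypothetical stand-in that mirrors the documented async surface (`prepare`, `chat`); a real deployment would instead construct `ExternalModel(config)` from an `InferenceModelConfig` pointing at an OpenAI-compatible endpoint:

```python
import asyncio
from typing import Dict, List


class StubExternalModel:
    """Hypothetical stand-in mirroring ExternalModel's async interface."""

    async def prepare(self) -> None:
        # A real ExternalModel would set up its API client here.
        self.ready = True

    async def chat(self, messages: List[Dict], **kwargs) -> List[Dict]:
        # A real ExternalModel would call the external OpenAI-compatible
        # API and return a Sequence[Experience]; we echo the last message.
        return [{"response": "echo: " + messages[-1]["content"]}]


async def main() -> List[Dict]:
    model = StubExternalModel()
    await model.prepare()  # prepare the model before inference
    messages = [{"role": "user", "content": "hello"}]
    return await model.chat(messages)


experiences = asyncio.run(main())
print(experiences[0]["response"])
```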

__init__(config: InferenceModelConfig) None[source]#
async prepare() None[source]#

Prepare the model before inference.

async chat(messages: List[Dict], **kwargs) Sequence[Experience][source]#

Asynchronously generate experiences from a list of chat history messages.

async chat_async(messages: List[Dict], **kwargs) Sequence[Experience][source]#
async generate_async(prompt: str | List[Dict], **kwargs) Sequence[Experience][source]#
async generate(prompt: str | List[Dict], **kwargs) Sequence[Experience][source]#

Asynchronously generate responses from a prompt.
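Note that `generate` accepts either a raw prompt string or a list of chat messages. A minimal sketch of how these two input shapes can be unified, using a hypothetical `normalize_prompt` helper (not part of this module):

```python
from typing import Dict, List, Union


def normalize_prompt(prompt: Union[str, List[Dict]]) -> List[Dict]:
    """Hypothetical helper: wrap a bare string as a single user message."""
    if isinstance(prompt, str):
        return [{"role": "user", "content": prompt}]
    return prompt


# Both call forms end up as the same message list.
as_string = normalize_prompt("What is RFT?")
as_messages = normalize_prompt([{"role": "user", "content": "What is RFT?"}])
print(as_string == as_messages)
```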

async logprobs(token_ids: List[int], **kwargs) Tensor[source]#

Asynchronously compute log probabilities for a list of token IDs.

async convert_messages_to_experience(messages: List[dict], tools: List[dict] | None = None, temperature: float | None = None) Experience[source]#

Asynchronously convert a list of messages into an experience.

async sync_model(model_version: int) int[source]#

Synchronize the model to the given model_version.

get_model_version() int[source]#

Get the checkpoint version.

get_api_key() str[source]#

Get the API key.

get_api_server_url() str | None[source]#

Get the API server URL if available.
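A sketch of the version-tracking contract implied by `sync_model` and `get_model_version`. The `StubVersionedModel` below is hypothetical; a real `ExternalModel` would also refresh its weights or endpoint when the version advances:

```python
import asyncio


class StubVersionedModel:
    """Hypothetical stand-in for ExternalModel's version tracking."""

    def __init__(self) -> None:
        self._model_version = 0

    async def sync_model(self, model_version: int) -> int:
        # A real implementation would refresh weights/endpoints here too.
        self._model_version = model_version
        return self._model_version

    def get_model_version(self) -> int:
        return self._model_version


async def main() -> int:
    model = StubVersionedModel()
    await model.sync_model(3)
    return model.get_model_version()


version = asyncio.run(main())
print(version)
```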