trinity.common.models.external_model module#

class trinity.common.models.external_model.ExternalModel(config: InferenceModelConfig)[source]#

Bases: InferenceModel

Inference model backed by an external OpenAI-compatible API.

__init__(config: InferenceModelConfig) None[source]#
async prepare() None[source]#

Prepare the model before inference.
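
The documented call sequence is: construct the model, await prepare(), then run inference calls such as chat(). The sketch below illustrates that lifecycle with a self-contained stub that mirrors the documented interface; the stub class, its fields, and the echoed response are assumptions for illustration, not Trinity's implementation.

```python
import asyncio
from dataclasses import dataclass
from typing import Dict, List, Sequence


# Hypothetical stand-in for trinity's Experience; only the field used
# below is sketched here.
@dataclass
class Experience:
    response_text: str


class StubExternalModel:
    """Minimal stub following ExternalModel's documented call sequence."""

    def __init__(self, api_url: str) -> None:
        self.api_url = api_url
        self.ready = False

    async def prepare(self) -> None:
        # The real class would set up its OpenAI-compatible client here.
        self.ready = True

    async def chat(self, messages: List[Dict], **kwargs) -> Sequence[Experience]:
        assert self.ready, "call prepare() before inference"
        # A real implementation would POST to the external API; this stub
        # just echoes the last user message.
        return [Experience(response_text=f"echo: {messages[-1]['content']}")]


async def main() -> Sequence[Experience]:
    model = StubExternalModel(api_url="http://localhost:8000/v1")
    await model.prepare()  # prepare before any inference call
    return await model.chat([{"role": "user", "content": "hi"}])


experiences = asyncio.run(main())
print(experiences[0].response_text)
```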

async chat(messages: List[Dict], **kwargs) Sequence[Experience][source]#

Asynchronously generate experiences from a list of chat history messages.

async chat_async(messages: List[Dict], **kwargs) Sequence[Experience][source]#
async generate_async(prompt: str | List[Dict], **kwargs) Sequence[Experience][source]#
async generate(prompt: str | List[Dict], **kwargs) Sequence[Experience][source]#

Asynchronously generate responses from a prompt.
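
The prompt: str | List[Dict] signature implies that a bare string and a chat message list are both accepted. A minimal sketch of the normalization such an overload typically performs, assuming a plain string is treated as a single user turn (the helper name and the wrapping convention are assumptions, not Trinity API):

```python
from typing import Dict, List, Union


def normalize_prompt(prompt: Union[str, List[Dict]]) -> List[Dict]:
    """Wrap a bare string into a single-turn chat message list.

    Illustrative helper only; the wrapping convention is an assumption.
    """
    if isinstance(prompt, str):
        return [{"role": "user", "content": prompt}]
    return prompt


assert normalize_prompt("hello") == [{"role": "user", "content": "hello"}]
```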

async logprobs(token_ids: List[int], **kwargs) Tensor[source]#

Asynchronously generate log probabilities for a list of token IDs.

async convert_messages_to_experience(messages: List[dict], tools: List[dict] | None = None, temperature: float | None = None) Experience[source]#

Asynchronously convert a list of messages into an experience.

async sync_model(model_version: int) int[source]#

Sync the model with the latest model_version.
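
Together with get_model_version(), this implies a simple contract: sync_model(model_version) returns the version in use after syncing, and get_model_version() reports it afterwards. A self-contained sketch of that contract, using a stand-in class rather than Trinity's implementation (for an external API-backed model there are no local weights to reload, so the sketch merely records the version; that behavior is an assumption):

```python
import asyncio


class VersionedStub:
    """Stand-in illustrating the sync_model / get_model_version contract."""

    def __init__(self) -> None:
        self._version = 0

    async def sync_model(self, model_version: int) -> int:
        # Assumed behavior: record the requested checkpoint version and
        # return the version now in use.
        self._version = model_version
        return self._version

    def get_model_version(self) -> int:
        return self._version


stub = VersionedStub()
synced = asyncio.run(stub.sync_model(3))
```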

get_model_version() int[source]#

Get the checkpoint version.

get_api_key() str[source]#

Get the API key.

get_api_server_url() str | None[source]#

Get the API server URL if available.