trinity.common.models.external_model module#
- class trinity.common.models.external_model.ExternalModel(config: InferenceModelConfig)[source]#
Bases: InferenceModel
Inference model backed by an external OpenAI-compatible API.
- __init__(config: InferenceModelConfig) → None[source]#
- async chat(messages: List[Dict], **kwargs) → Sequence[Experience][source]#
Asynchronously generate experiences from a list of chat history messages.
- async chat_async(messages: List[Dict], **kwargs) → Sequence[Experience][source]#
- async generate_async(prompt: str | List[Dict], **kwargs) → Sequence[Experience][source]#
- async generate(prompt: str | List[Dict], **kwargs) → Sequence[Experience][source]#
Asynchronously generate responses from a prompt.
- async logprobs(token_ids: List[int], **kwargs) → Tensor[source]#
Asynchronously generate log probabilities for a list of token IDs.
- async convert_messages_to_experience(messages: List[dict], tools: List[dict] | None = None, temperature: float | None = None) → Experience[source]#
Asynchronously convert a list of messages into an experience.
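A minimal usage sketch of the async call pattern above. In real use you would construct ExternalModel with an InferenceModelConfig pointing at a running OpenAI-compatible endpoint; here a hypothetical stub stands in for ExternalModel (same chat signature, canned return values instead of real Experience objects) so the snippet runs without a live server:

```python
import asyncio
from typing import Dict, List, Sequence


class StubExternalModel:
    """Hypothetical stand-in for ExternalModel: mirrors the async chat
    signature, but returns plain strings instead of Experience objects."""

    async def chat(self, messages: List[Dict], **kwargs) -> Sequence[str]:
        # A real ExternalModel would forward the messages to the external
        # OpenAI-compatible API and wrap each completion in an Experience.
        return [f"echo: {messages[-1]['content']}"]


async def main() -> Sequence[str]:
    model = StubExternalModel()
    # All model methods are coroutines, so each call must be awaited.
    return await model.chat([{"role": "user", "content": "hello"}])


experiences = asyncio.run(main())
print(experiences)
```

Because every method on the class is a coroutine, calls compose naturally with asyncio.gather when issuing many requests against the external endpoint concurrently.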