trinity.common.models.external_model module#
- class trinity.common.models.external_model.ExternalModel(config: InferenceModelConfig)[source]#
Inference model backed by an external OpenAI-compatible API.
- __init__(config: InferenceModelConfig) → None[source]#
- async chat(messages: List[Dict], **kwargs) → Sequence[Experience][source]#
Asynchronously generate experiences from a list of chat history messages.
- async chat_async(messages: List[Dict], **kwargs) → Sequence[Experience][source]#
- async generate_async(prompt: str | List[Dict], **kwargs) → Sequence[Experience][source]#
- async generate(prompt: str | List[Dict], **kwargs) → Sequence[Experience][source]#
Asynchronously generate responses from a prompt.
- async logprobs(token_ids: List[int], **kwargs) → Tensor[source]#
Asynchronously compute logprobs for a list of token IDs.
- async convert_messages_to_experience(messages: List[dict], tools: List[dict] | None = None, temperature: float | None = None) → Experience[source]#
Asynchronously convert a list of messages into an experience.
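To illustrate the call pattern of this interface, here is a minimal, self-contained sketch. It does not use the real `ExternalModel` (which requires an `InferenceModelConfig` pointing at a live OpenAI-compatible endpoint); instead it defines a hypothetical stand-in class, `FakeExternalModel`, and a toy `Experience` dataclass so the async `chat()` usage can be shown end to end. All names in the stand-in are assumptions for demonstration, not part of the documented API.

```python
import asyncio
from dataclasses import dataclass
from typing import Dict, List, Sequence


# Hypothetical stand-in for trinity's Experience type, for illustration only.
@dataclass
class Experience:
    response_text: str


class FakeExternalModel:
    """Toy stand-in mimicking ExternalModel's async chat interface.

    The real class forwards the messages to an external OpenAI-compatible
    API; here we just echo the last user message.
    """

    async def chat(self, messages: List[Dict], **kwargs) -> Sequence[Experience]:
        # A real implementation would send `messages` to the configured
        # chat-completions endpoint and wrap each returned choice.
        last_user = messages[-1]["content"]
        return [Experience(response_text=f"echo: {last_user}")]


async def main() -> str:
    model = FakeExternalModel()
    # Messages use the standard OpenAI-compatible chat format.
    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "hello"},
    ]
    experiences = await model.chat(messages)
    return experiences[0].response_text


if __name__ == "__main__":
    print(asyncio.run(main()))  # prints "echo: hello"
```

Against the real class, you would construct it with an `InferenceModelConfig`, and the other coroutines (`generate`, `logprobs`, `convert_messages_to_experience`) follow the same `await` pattern.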