trinity.common.models package
Subpackages
Submodules
- trinity.common.models.mm_utils module
- trinity.common.models.model module
  - InferenceModel
    - InferenceModel.__init__()
    - InferenceModel.generate()
    - InferenceModel.chat()
    - InferenceModel.logprobs()
    - InferenceModel.convert_messages_to_experience()
    - InferenceModel.prepare()
    - InferenceModel.sync_model()
    - InferenceModel.get_model_version()
    - InferenceModel.get_available_address()
    - InferenceModel.get_api_server_url()
    - InferenceModel.get_api_key()
    - InferenceModel.get_model_config()
    - InferenceModel.get_model_path()
    - InferenceModel.shutdown()
  - BaseInferenceModel
  - ModelWrapper
    - ModelWrapper.__init__()
    - ModelWrapper.prepare()
    - ModelWrapper.generate()
    - ModelWrapper.generate_async()
    - ModelWrapper.chat()
    - ModelWrapper.chat_async()
    - ModelWrapper.logprobs()
    - ModelWrapper.logprobs_async()
    - ModelWrapper.convert_messages_to_experience()
    - ModelWrapper.convert_messages_to_experience_async()
    - ModelWrapper.api_key
    - ModelWrapper.model_version
    - ModelWrapper.model_version_async
    - ModelWrapper.model_path
    - ModelWrapper.model_path_async
    - ModelWrapper.model_name
    - ModelWrapper.model_config
    - ModelWrapper.generate_kwargs
    - ModelWrapper.get_lora_request()
    - ModelWrapper.get_lora_request_async()
    - ModelWrapper.get_message_token_len()
    - ModelWrapper.get_openai_client()
    - ModelWrapper.get_openai_async_client()
    - ModelWrapper.get_current_load()
    - ModelWrapper.sync_model_weights()
    - ModelWrapper.extract_experience_from_history()
    - ModelWrapper.set_workflow_state()
    - ModelWrapper.clean_workflow_state()
    - ModelWrapper.get_workflow_state()
    - ModelWrapper.clone_with_isolated_history()
  - convert_api_output_to_experience()
  - HistoryRecordingStream
  - extract_logprobs()
- trinity.common.models.tinker_model module
- trinity.common.models.utils module
- trinity.common.models.vllm_model module
- trinity.common.models.vllm_worker module
Module contents
- trinity.common.models.create_explorer_models(config: Config) → Tuple[List, List[List]] [source]
Create rollout_models and auxiliary_models.
  - Parameters:
    config -- The trinity configuration.
  - Returns:
    The rollout_models and auxiliary_models.
  - Return type:
    Tuple[List, List[List]]
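The nested return shape of create_explorer_models can be unpacked as in the following sketch. The helper name summarize_explorer_models and the string placeholder handles are purely illustrative (not part of the library); real calls return inference model actor handles in the same structure.

```python
from typing import List, Tuple

def summarize_explorer_models(models: Tuple[List, List[List]]) -> Tuple[int, List[int]]:
    """Return (number of rollout models, size of each auxiliary model group)."""
    rollout_models, auxiliary_models = models
    return len(rollout_models), [len(group) for group in auxiliary_models]

# Placeholder handles showing the nesting: auxiliary_models is a list of
# groups, one inner list per auxiliary model.
models = (["rollout_0", "rollout_1"], [["aux_0a", "aux_0b"], ["aux_1"]])
print(summarize_explorer_models(models))  # -> (2, [2, 1])
```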
- trinity.common.models.create_vllm_inference_models(config: InferenceModelConfig, allocator: _BundleAllocator, actor_name: str) → List [source]
- async trinity.common.models.create_debug_explorer_model(config: Config) → None [source]
Create explorer inference models for debugging.
- trinity.common.models.get_debug_explorer_model(config: Config) → Tuple[InferenceModel, List[InferenceModel]] [source]
Get the inference models for debugging. The models must first be created by create_debug_explorer_model in another process.
- async trinity.common.models.get_auxiliary_model_wrappers(config: Config) → Dict[str | int, List[ModelWrapper]] [source]
Get auxiliary models.
  - Returns:
    A dictionary mapping each auxiliary model index to a list of auxiliary model actor handlers.
  - Return type:
    Dict[str | int, List[ModelWrapper]]
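A minimal sketch of iterating the Dict[str | int, List[ModelWrapper]] mapping returned above; the keys and string values here are invented placeholders standing in for real model indices and ModelWrapper handles.

```python
from typing import Dict, List, Union

# Placeholder mapping illustrating the return shape: keys may be strings
# or integers, and each maps to a list of wrapper handles.
wrappers: Dict[Union[str, int], List[str]] = {
    "reward_model": ["wrapper_0", "wrapper_1"],
    1: ["wrapper_2"],
}

for key, handles in wrappers.items():
    print(f"auxiliary model {key!r}: {len(handles)} wrapper(s)")
```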