haive.core.models.metadata_mixin¶
Model metadata mixin for LLM configurations.
This module provides a mixin class that adds comprehensive model metadata access to LLM configuration classes, including context windows, pricing, and capability information.
Classes¶
ModelMetadataMixin | Mixin to add comprehensive model metadata methods to LLMConfig classes.
Module Contents¶
- class haive.core.models.metadata_mixin.ModelMetadataMixin[source]¶
Mixin to add comprehensive model metadata methods to LLMConfig classes.
This mixin provides access to model capabilities, context window sizes, pricing information, and other metadata from the model catalog.
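The mixin pattern described above can be sketched as follows. This is a hypothetical illustration, not the actual haive implementation: the catalog dictionary, model name, and `LLMConfig` constructor shown here are invented for the example.

```python
# Hypothetical sketch: metadata methods are mixed into a config class and
# resolved against a model catalog keyed by model name. All catalog values
# below are invented for illustration.
MODEL_CATALOG = {
    "example-chat-model": {
        "max_input_tokens": 128_000,
        "max_output_tokens": 4_096,
        "mode": "chat",
    },
}


class ModelMetadataMixin:
    """Looks up metadata for ``self.model`` in the catalog."""

    def _entry(self) -> dict:
        # Unknown models resolve to an empty entry rather than raising.
        return MODEL_CATALOG.get(self.model, {})

    def get_context_window(self) -> int:
        # Total window = input budget + output budget.
        entry = self._entry()
        return entry.get("max_input_tokens", 0) + entry.get("max_output_tokens", 0)

    def get_model_mode(self) -> str:
        return self._entry().get("mode", "chat")


class LLMConfig(ModelMetadataMixin):
    def __init__(self, model: str):
        self.model = model


config = LLMConfig("example-chat-model")
print(config.get_context_window())  # 132096
print(config.get_model_mode())      # chat
```

Because the metadata lives in a catalog rather than on each config class, new models only require a catalog entry, not code changes.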
- get_context_window()[source]¶
Get the maximum context window size for this model.
- Returns:
Total context window size (input + output tokens)
- Return type:
int
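A common use of the total context window is checking whether a prompt plus its expected reply will fit. The helper and the token counts below are hypothetical, assuming `get_context_window()` returns the combined input-plus-output size described above.

```python
# Hypothetical budget check against the value of get_context_window().
def fits_in_context(prompt_tokens: int, reply_tokens: int, context_window: int) -> bool:
    """True if the prompt plus the expected reply fits in the model's window."""
    return prompt_tokens + reply_tokens <= context_window


print(fits_in_context(100_000, 4_000, 128_000))  # True
print(fits_in_context(127_000, 4_000, 128_000))  # False
```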
- get_deprecation_date()[source]¶
Get the deprecation date for this model, if available.
- Returns:
Deprecation date in YYYY-MM-DD format, or None if not deprecated
- Return type:
Optional[str]
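A caller might use the `YYYY-MM-DD`-or-`None` contract above to warn about models past their sunset date. This helper is a hypothetical sketch; the sample dates are invented.

```python
from datetime import date
from typing import Optional


# Hypothetical check built on the get_deprecation_date() contract:
# a string in YYYY-MM-DD format, or None when the model is not deprecated.
def is_deprecated(deprecation_date: Optional[str], today: Optional[date] = None) -> bool:
    if deprecation_date is None:
        return False
    today = today or date.today()
    return date.fromisoformat(deprecation_date) <= today


print(is_deprecated(None))                                     # False
print(is_deprecated("2024-01-01", today=date(2025, 6, 1)))     # True
print(is_deprecated("2099-01-01", today=date(2025, 6, 1)))     # False
```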
- get_max_input_tokens()[source]¶
Get the maximum input tokens for this model.
- Returns:
Maximum input tokens the model can accept
- Return type:
int
- get_max_output_tokens()[source]¶
Get the maximum output tokens for this model.
- Returns:
Maximum output tokens the model can generate
- Return type:
int
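The output limit is typically used to clamp a requested `max_tokens` before sending an API call. The helper below is a hypothetical sketch; the limit values are invented.

```python
# Hypothetical clamp against the value of get_max_output_tokens():
# never request more tokens than the model can actually generate.
def clamp_max_tokens(requested: int, model_max_output: int) -> int:
    return min(requested, model_max_output)


print(clamp_max_tokens(8_192, 4_096))  # 4096
print(clamp_max_tokens(1_024, 4_096))  # 1024
```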
- get_model_mode()[source]¶
Get the mode for this model.
- Returns:
Model mode (e.g., “chat”, “embedding”, “completion”)
- Return type:
str
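The mode string lends itself to dispatch, for example choosing an API route per mode. This sketch is hypothetical; the endpoint paths are invented for illustration and are not part of the haive API.

```python
# Hypothetical dispatch on the mode string returned by get_model_mode().
def endpoint_for_mode(mode: str) -> str:
    endpoints = {
        "chat": "/v1/chat/completions",
        "completion": "/v1/completions",
        "embedding": "/v1/embeddings",
    }
    try:
        return endpoints[mode]
    except KeyError:
        raise ValueError(f"unsupported model mode: {mode!r}")


print(endpoint_for_mode("chat"))  # /v1/chat/completions
```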
- get_supported_endpoints()[source]¶
Get the supported API endpoints for this model.
- Returns:
List of supported endpoints
- Return type:
List[str]
- get_supported_modalities()[source]¶
Get the supported input modalities for this model.
- Returns:
List of supported modalities (e.g., “text”, “image”)
- Return type:
List[str]
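A typical guard built on the modality list rejects requests the model cannot handle, such as sending an image to a text-only model. The helper and the sample modality lists below are hypothetical.

```python
# Hypothetical guard using the list returned by get_supported_modalities().
def check_modalities(required, supported) -> None:
    """Raise ValueError if any required input modality is unsupported."""
    missing = set(required) - set(supported)
    if missing:
        raise ValueError(f"model lacks input modalities: {sorted(missing)}")


check_modalities(["text"], ["text", "image"])   # passes silently
# check_modalities(["image"], ["text"])         # would raise ValueError
```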
- get_supported_output_modalities()[source]¶
Get the supported output modalities for this model.
- Returns:
List of supported output modalities
- Return type:
List[str]
- property supports_parallel_function_calling: bool¶
Check if model supports parallel function calling.
- Return type:
bool
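A caller might consult this flag to decide whether tool calls can be batched in one round trip or must be issued sequentially. The planner below is a hypothetical sketch of that decision, not haive code.

```python
# Hypothetical planner keyed off supports_parallel_function_calling:
# one batch when parallel calls are allowed, otherwise one call per batch.
def plan_tool_calls(calls, parallel_supported: bool):
    if parallel_supported:
        return [list(calls)]
    return [[c] for c in calls]


print(plan_tool_calls(["a", "b"], True))   # [['a', 'b']]
print(plan_tool_calls(["a", "b"], False))  # [['a'], ['b']]
```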