haive.core.models.llm
=====================

.. py:module:: haive.core.models.llm

.. autoapi-nested-parse::

   Haive LLM Module.

   This module provides comprehensive abstractions and implementations for
   working with Large Language Models (LLMs) from various providers. It
   includes configuration classes, provider-specific implementations, and
   utilities for model metadata.

   The module supports a wide range of LLM providers, including OpenAI,
   Anthropic, Google, Azure, Mistral, and many others, with a consistent
   interface for configuration and use.

   Key Components:

   - Base Classes: Abstract base classes for LLM configurations
   - Provider Types: Enumeration of supported LLM providers
   - Provider Implementations: Provider-specific configuration classes
   - Metadata: Utilities for accessing model capabilities and context windows

   Typical usage example:

   .. rubric:: Examples

   >>> from haive.core.models.llm.base import OpenAILLMConfig
   >>>
   >>> # Configure an LLM
   >>> config = OpenAILLMConfig(
   ...     model="gpt-4",
   ...     cache_enabled=True,
   ... )
   >>>
   >>> # Instantiate the LLM
   >>> llm = config.instantiate()
   >>>
   >>> # Generate text
   >>> response = llm.generate("Explain quantum computing")

Submodules
----------

.. toctree::
   :maxdepth: 1

   /autoapi/haive/core/models/llm/base/index
   /autoapi/haive/core/models/llm/export_llm_models_to_csv/index
   /autoapi/haive/core/models/llm/factory/index
   /autoapi/haive/core/models/llm/provider_types/index
   /autoapi/haive/core/models/llm/providers/index
   /autoapi/haive/core/models/llm/rate_limiting_mixin/index
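The configure-then-instantiate pattern described above can be sketched in plain Python. The classes below (``BaseLLMConfig``, ``FakeOpenAIConfig``, ``FakeLLM``) are hypothetical stand-ins written for illustration only; they are not the actual haive API, which lives in ``haive.core.models.llm.base``.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


class BaseLLMConfig(ABC):
    """Illustrative abstract base: every provider config can build its model."""

    @abstractmethod
    def instantiate(self) -> "FakeLLM":
        ...


@dataclass
class FakeOpenAIConfig(BaseLLMConfig):
    """Hypothetical provider-specific config (not the real OpenAILLMConfig)."""

    model: str = "gpt-4"
    cache_enabled: bool = False

    def instantiate(self) -> "FakeLLM":
        # Provider-specific setup (auth, caching, ...) would happen here.
        return FakeLLM(self)


class FakeLLM:
    """Toy model object exposing the generate() call used in the example."""

    def __init__(self, config: BaseLLMConfig) -> None:
        self.config = config

    def generate(self, prompt: str) -> str:
        return f"[{self.config.model}] response to: {prompt}"


config = FakeOpenAIConfig(model="gpt-4", cache_enabled=True)
llm = config.instantiate()
print(llm.generate("Explain quantum computing"))
```

The point of the pattern is that callers depend only on the abstract ``instantiate()`` contract, so swapping providers means swapping the config class, not the calling code.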