haive.core.models.llm.providers¶
LLM Providers Module.
This module contains provider-specific implementations for various Language Model providers supported by the Haive framework. Each provider is implemented in its own module with safe imports and proper error handling.
The module uses lazy imports to avoid requiring all provider dependencies to be installed. Only the providers actually used will trigger dependency checks.
- haive.core.models.llm.providers.BaseLLMProvider[source]¶
Base class for all LLM provider implementations.
- Parameters:
data (Any)
- haive.core.models.llm.providers.ProviderImportError[source]¶
Exception raised when provider dependencies are missing.
- haive.core.models.llm.providers.get_provider[source]¶
Function to get a provider class by enum value.
- Available Providers:
OpenAI (GPT-3.5, GPT-4, etc.)
Anthropic (Claude models)
Google (Gemini, Vertex AI)
Azure OpenAI
AWS Bedrock
Mistral AI
Groq
Cohere
Together AI
Fireworks AI
Hugging Face
NVIDIA AI Endpoints
Ollama (local models)
Llama.cpp (local models)
And many more…
Examples
Safe import with error handling:
from haive.core.models.llm.providers import get_provider
from haive.core.models.llm.provider_types import LLMProvider
try:
provider_class = get_provider(LLMProvider.OPENAI)
provider = provider_class(model="gpt-4")
llm = provider.instantiate()
except ImportError as e:
print(f"Provider not available: {e}")
List available providers:
from haive.core.models.llm.providers import list_providers
available_providers = list_providers()
print(f"Available LLM providers: {available_providers}")
Dynamic provider instantiation:
provider_name = "OpenAI"
if provider_name in list_providers():
provider_class = get_provider(LLMProvider.OPENAI)
llm = provider_class(model="gpt-4").instantiate()
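Subclass the base provider (a minimal, hypothetical sketch: the model field and the instantiate() hook are inferred from the usage examples above, not from the BaseLLMProvider definition; see the base submodule for the real interface):
from haive.core.models.llm.providers import BaseLLMProvider

class MyInHouseProvider(BaseLLMProvider):
    """Hypothetical provider wrapping an internal endpoint."""

    model: str = "in-house-model-v1"

    def instantiate(self):
        # Build and return the concrete chat model client here.
        raise NotImplementedError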
Note
Provider classes are available via lazy loading through __getattr__. They are not included in __all__ to avoid AutoAPI import issues and ensure fast module initialization.
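In practice this means a provider class is only imported, and its third-party dependency only checked, when you first reference it on the package. A minimal sketch, assuming the OpenAI provider class is exposed under a name like OpenAIProvider (the actual class name lives in the openai submodule):
import haive.core.models.llm.providers as providers

# Attribute lookup is routed through the package's __getattr__, so the
# provider module and its optional dependency are imported lazily here.
provider_class = getattr(providers, "OpenAIProvider")  # class name assumed for illustration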
Submodules¶
- haive.core.models.llm.providers.ai21
- haive.core.models.llm.providers.anthropic
- haive.core.models.llm.providers.azure
- haive.core.models.llm.providers.base
- haive.core.models.llm.providers.bedrock
- haive.core.models.llm.providers.cohere
- haive.core.models.llm.providers.fireworks
- haive.core.models.llm.providers.google
- haive.core.models.llm.providers.groq
- haive.core.models.llm.providers.huggingface
- haive.core.models.llm.providers.mistral
- haive.core.models.llm.providers.nvidia
- haive.core.models.llm.providers.ollama
- haive.core.models.llm.providers.openai
- haive.core.models.llm.providers.replicate
- haive.core.models.llm.providers.together
- haive.core.models.llm.providers.xai
Functions¶
- get_provider – Get a provider class by enum value.
- list_providers – List all available provider names.
Package Contents¶
- haive.core.models.llm.providers.get_provider(provider)[source]¶
Get a provider class by enum value.
- Parameters:
provider (haive.core.models.llm.provider_types.LLMProvider) – The provider enum value
- Returns:
The provider class
- Raises:
ValueError – If provider is not supported
ImportError – If provider dependencies are not installed
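Putting the documented failure modes together, a defensive call might look like this (a minimal sketch; error reporting is up to the caller):
from haive.core.models.llm.providers import get_provider
from haive.core.models.llm.provider_types import LLMProvider
try:
    provider_class = get_provider(LLMProvider.OPENAI)
except ValueError:
    print("Provider is not supported")
except ImportError as e:
    print(f"Provider dependencies are not installed: {e}")
else:
    llm = provider_class(model="gpt-4").instantiate()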