haive.core.models.llm.providers

LLM Providers Module.

This module contains provider-specific implementations for the language model providers supported by the Haive framework. Each provider lives in its own module with safe imports and proper error handling.

The module uses lazy imports to avoid requiring all provider dependencies to be installed. Only the providers actually used will trigger dependency checks.

haive.core.models.llm.providers.BaseLLMProvider[source]

Base class for all LLM provider implementations.

Parameters:

data (Any)

haive.core.models.llm.providers.ProviderImportError[source]

Exception raised when provider dependencies are missing.

Parameters:
  • provider (str)

  • package (str)

  • message (str | None)

Available Providers:
  • OpenAI (GPT-3.5, GPT-4, etc.)

  • Anthropic (Claude models)

  • Google (Gemini, Vertex AI)

  • Azure OpenAI

  • AWS Bedrock

  • Mistral AI

  • Groq

  • Cohere

  • Together AI

  • Fireworks AI

  • Hugging Face

  • NVIDIA AI Endpoints

  • Ollama (local models)

  • Llama.cpp (local models)

  • And many more…

Examples

Safe import with error handling:

from haive.core.models.llm.providers import get_provider
from haive.core.models.llm.provider_types import LLMProvider

try:
    provider_class = get_provider(LLMProvider.OPENAI)
    provider = provider_class(model="gpt-4")
    llm = provider.instantiate()
except ImportError as e:
    print(f"Provider not available: {e}")

List available providers:

from haive.core.models.llm.providers import list_providers

available_providers = list_providers()
print(f"Available LLM providers: {available_providers}")

Dynamic provider instantiation:

from haive.core.models.llm.providers import get_provider, list_providers
from haive.core.models.llm.provider_types import LLMProvider

provider_name = "OpenAI"
if provider_name in list_providers():
    provider_class = get_provider(LLMProvider.OPENAI)
    llm = provider_class(model="gpt-4").instantiate()

Note

Provider classes are available via lazy loading through __getattr__. They are not included in __all__ to avoid AutoAPI import issues and ensure fast module initialization.

Functions

get_provider(provider)

Get a provider class by enum value.

list_providers()

List all available provider names.

Package Contents

haive.core.models.llm.providers.get_provider(provider)[source]

Get a provider class by enum value.

Parameters:

provider (haive.core.models.llm.provider_types.LLMProvider) – The provider enum value

Returns:

The provider class

Raises:

ProviderImportError – If the provider's required dependencies are not installed

Return type:

type[base.BaseLLMProvider]

haive.core.models.llm.providers.list_providers()[source]

List all available provider names.

Returns:

List of provider names that can be imported

Return type:

list[str]