haive.core.engine.embedding

Embedding engine module for the Haive framework.

This module provides comprehensive embedding functionality with support for multiple providers including OpenAI, Azure OpenAI, HuggingFace, Cohere, Google Vertex AI, Ollama, and more.

The module follows a factory pattern: embedding configurations are created through provider-specific config classes and then instantiated to produce the actual embedding objects. All providers are lazily loaded to minimize startup time.

haive.core.engine.embedding.BaseEmbeddingConfig

Base class for all embedding configurations.

haive.core.engine.embedding.EmbeddingConfigFactory

Factory for creating embedding configurations.

haive.core.engine.embedding.EmbeddingType

Enum of supported embedding providers.
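
Since this enum is the canonical list of supported providers, iterating over it shows what the installed version offers. This is a minimal sketch, assuming EmbeddingType is a standard Python Enum; member names vary by version:

from haive.core.engine.embedding import EmbeddingType

# Print every provider defined by the enum
for provider in EmbeddingType:
    print(provider.name, provider.value)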

haive.core.engine.embedding.create_embedding_config

Factory function for creating configurations.

Parameters:
provider name and provider-specific keyword arguments (see the examples below)

Return type:
haive.core.engine.embedding.base.BaseEmbeddingConfig

haive.core.engine.embedding.providers

Lazily loaded providers module.

Examples

Basic usage with OpenAI embeddings:

from haive.core.engine.embedding.providers import OpenAIEmbeddingConfig

# Create configuration
config = OpenAIEmbeddingConfig(
    name="my_embeddings",
    model="text-embedding-3-large"
)

# Instantiate embeddings
embeddings = config.instantiate()

# Embed text
vectors = embeddings.embed_documents(["Hello world", "How are you?"])
query_vector = embeddings.embed_query("Hello")
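
Assuming the embeddings object follows the standard embed_documents / embed_query interface shown above (lists of float vectors), a quick sanity check on the results looks like this:

# One vector comes back per input document
print(len(vectors))  # 2
# The query vector's length is the model's embedding dimensionality
# (3072 by default for text-embedding-3-large)
print(len(query_vector))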

Configuration discovery and dynamic provider selection:

from haive.core.engine.embedding import BaseEmbeddingConfig

# List all registered providers
available_providers = BaseEmbeddingConfig.list_registered_types()
print(f"Available providers: {available_providers}")

# Get specific provider class
provider_class = BaseEmbeddingConfig.get_config_class("OpenAI")
config = provider_class(name="dynamic_embeddings")
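
The dynamically selected configuration is used exactly like one created directly, so the same instantiation step applies:

embeddings = config.instantiate()
vector = embeddings.embed_query("dynamic provider lookup")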

Using the factory function for simplified creation:

from haive.core.engine.embedding import create_embedding_config

config = create_embedding_config(
    provider="OpenAI",
    model="text-embedding-3-large",
    name="my_embeddings"
)

embeddings = config.instantiate()
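
Because provider selection is just a string argument, switching providers is a matter of changing the arguments. The sketch below is illustrative only; the exact HuggingFace provider string and model identifier are assumptions and may differ in your installation:

hf_config = create_embedding_config(
    provider="HuggingFace",  # assumed provider name
    model="sentence-transformers/all-MiniLM-L6-v2",  # assumed model identifier
    name="local_embeddings"
)
local_embeddings = hf_config.instantiate()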

Note

All provider classes are lazily loaded through the providers attribute to avoid import overhead during module initialization. This allows the framework to start quickly even when many providers are available.
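
In practice this means importing the package is cheap, and provider modules are resolved only when first accessed. A minimal illustration of the lazy-loading behaviour described above:

import haive.core.engine.embedding as embedding

# No provider SDKs have been imported yet at this point.
# Touching the attribute triggers the lazy load of the providers module.
providers = embedding.providers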

Submodules