haive.core.engine.retriever.providers.LlamaIndexRetrieverConfig¶
LlamaIndex Retriever implementation for the Haive framework.
This module provides a configuration class for the LlamaIndex retriever, which integrates LlamaIndex’s retrieval capabilities with LangChain. LlamaIndex provides a data framework for LLM applications with sophisticated indexing and retrieval mechanisms.
The LlamaIndexRetriever works by:

1. Using LlamaIndex’s retrieval engines
2. Supporting various index types (vector, keyword, graph, etc.)
3. Enabling sophisticated query processing
4. Providing LlamaIndex-specific optimizations
This retriever is particularly useful when you are:

- Integrating LlamaIndex with LangChain workflows
- Relying on LlamaIndex’s advanced indexing capabilities
- Leveraging LlamaIndex’s query engines
- Building complex retrieval pipelines
- Using LlamaIndex’s data connectors
The implementation integrates LlamaIndex retrievers with LangChain while providing a consistent Haive configuration interface.
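The snippet below sketches that bridge in isolation. It is an illustration only, not the Haive or LangChain implementation: it builds a LlamaIndex VectorStoreIndex, retrieves NodeWithScore results, and converts them to LangChain Document objects. It assumes the llama-index and langchain-core packages are installed and that a default embedding model is configured.

>>> # Minimal sketch of the LlamaIndex-to-LangChain bridge (illustrative only)
>>> from langchain_core.documents import Document as LCDocument
>>> from llama_index.core import Document as LIDocument, VectorStoreIndex
>>>
>>> # Build a LlamaIndex vector index from plain text
>>> li_docs = [LIDocument(text="LlamaIndex provides a data framework for LLMs")]
>>> index = VectorStoreIndex.from_documents(li_docs)
>>>
>>> # LlamaIndex retrievers return NodeWithScore objects ...
>>> li_retriever = index.as_retriever(similarity_top_k=5)
>>> nodes = li_retriever.retrieve("data framework")
>>>
>>> # ... which a wrapper converts into LangChain Documents
>>> lc_docs = [
...     LCDocument(page_content=n.node.get_content(), metadata={"score": n.score})
...     for n in nodes
... ]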
Classes¶
LlamaIndexRetrieverConfig: Configuration for the LlamaIndex retriever in the Haive framework.
Module Contents¶
- class haive.core.engine.retriever.providers.LlamaIndexRetrieverConfig.LlamaIndexRetrieverConfig[source]¶
Bases:
haive.core.engine.retriever.retriever.BaseRetrieverConfig
Configuration for LlamaIndex retriever in the Haive framework.
This retriever integrates LlamaIndex’s retrieval capabilities with LangChain, enabling the use of LlamaIndex’s sophisticated indexing and query mechanisms.
- retriever_type¶
The type of retriever (always LLAMA_INDEX).
- Type:
- documents¶
Documents to index (if not loading from path).
- Type:
List[Document]
Examples
>>> from haive.core.engine.retriever import LlamaIndexRetrieverConfig
>>> from langchain_core.documents import Document
>>>
>>> # Create documents
>>> docs = [
...     Document(page_content="LlamaIndex provides data framework for LLMs"),
...     Document(page_content="Vector stores enable semantic search"),
...     Document(page_content="Graph indexes capture relationships")
... ]
>>>
>>> # Create the LlamaIndex retriever config
>>> config = LlamaIndexRetrieverConfig(
...     name="llamaindex_retriever",
...     documents=docs,
...     k=5,
...     index_type="vector",
...     similarity_top_k=10
... )
>>>
>>> # Instantiate and use the retriever
>>> retriever = config.instantiate()
>>> docs = retriever.get_relevant_documents("semantic search with vectors")
>>>
>>> # Example with graph index
>>> graph_config = LlamaIndexRetrieverConfig(
...     name="llamaindex_graph_retriever",
...     documents=docs,
...     index_type="knowledge_graph",
...     k=3
... )
- instantiate()[source]¶
Create a LlamaIndex retriever from this configuration.
- Returns:
Instantiated retriever ready for LlamaIndex-powered search.
- Return type:
LlamaIndexRetriever
- Raises:
ImportError – If required packages are not available.
ValueError – If configuration is invalid.
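As a usage note, a caller might guard instantiation against these documented failure modes (a sketch only; config refers to the configuration built in the Examples section above):

>>> # Sketch: handle the exceptions documented for instantiate()
>>> try:
...     retriever = config.instantiate()
... except ImportError:
...     # Required LlamaIndex packages are not installed
...     raise SystemExit("Install the LlamaIndex packages to use this retriever")
... except ValueError as exc:
...     # The configuration failed validation
...     raise SystemExit(f"Invalid LlamaIndex retriever configuration: {exc}")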