haive.core.models.llm.providers.anthropic¶
Anthropic Provider Module.
This module implements the Anthropic language model provider for the Haive framework, supporting Claude 3 models (Opus, Sonnet, Haiku) and earlier Claude versions.
The provider handles API key management, model configuration, and safe imports of the langchain-anthropic package dependencies.
Examples
Basic usage:

    from haive.core.models.llm.providers.anthropic import AnthropicProvider

    provider = AnthropicProvider(
        model="claude-3-opus-20240229",
        temperature=0.7,
        max_tokens=4096,
    )
    llm = provider.instantiate()

With streaming:

    provider = AnthropicProvider(
        model="claude-3-sonnet-20240229",
        streaming=True,
    )

    async for chunk in provider.instantiate().astream("Tell me a story"):
        print(chunk.content, end="")
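The "safe imports" mentioned above can be pictured as the standard optional-dependency guard; this is an illustrative sketch, not the module's actual code:

    # Illustrative only: a typical guard for the optional dependency.
    try:
        from langchain_anthropic import ChatAnthropic
    except ImportError as exc:
        raise ImportError(
            "AnthropicProvider requires the langchain-anthropic package. "
            "Install it with: pip install langchain-anthropic"
        ) from exc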
Classes¶
AnthropicProvider | Anthropic language model provider configuration.
Module Contents¶
- class haive.core.models.llm.providers.anthropic.AnthropicProvider(/, **data)[source]¶
Bases: haive.core.models.llm.providers.base.BaseLLMProvider
Anthropic language model provider configuration.
This provider supports all Anthropic Claude models including Claude 3 (Opus, Sonnet, Haiku) and Claude 2 variants. It provides access to Anthropic’s constitutional AI models with support for large context windows.
- Parameters:
data (Any)
requests_per_second (float | None)
tokens_per_second (int | None)
tokens_per_minute (int | None)
max_retries (int)
retry_delay (float)
check_every_n_seconds (float | None)
burst_size (int | None)
provider (LLMProvider)
model (str | None)
name (str | None)
api_key (SecretStr)
cache_enabled (bool)
cache_ttl (int | None)
debug (bool)
temperature (float | None)
max_tokens (int | None)
top_p (float | None)
top_k (int | None)
streaming (bool)
- provider¶
Always LLMProvider.ANTHROPIC
- model¶
Model name (default: “claude-3-sonnet-20240229”)
- temperature¶
Sampling temperature (0-1)
- max_tokens¶
Maximum tokens to generate (up to 4096)
- top_p¶
Nucleus sampling parameter
- top_k¶
Top-k sampling parameter
- streaming¶
Whether to stream responses
- Environment Variables:
ANTHROPIC_API_KEY: API key for authentication
ANTHROPIC_MODEL: Default model to use (see the sketch after the model list below)
- Model Variants:
claude-3-opus-20240229: Most capable, best for complex tasks
claude-3-sonnet-20240229: Balanced performance and speed
claude-3-haiku-20240307: Fastest, most cost-effective
claude-2.1: Previous generation, 200K context
claude-2.0: Previous generation, 100K context
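A minimal sketch of driving configuration from the environment variables above; the assumption that AnthropicProvider falls back to ANTHROPIC_API_KEY and ANTHROPIC_MODEL when no explicit values are passed follows from their descriptions, not from confirmed behavior:

    import os

    # Illustrative values; normally these are exported in the shell, not set in code.
    os.environ.setdefault("ANTHROPIC_API_KEY", "sk-ant-...")
    os.environ.setdefault("ANTHROPIC_MODEL", "claude-3-sonnet-20240229")

    # Assumption: with no explicit api_key/model, the provider reads the
    # environment variables documented above.
    provider = AnthropicProvider()
    llm = provider.instantiate()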
Examples
Using Claude 3 Opus:

    provider = AnthropicProvider(
        model="claude-3-opus-20240229",
        temperature=0.5,
        max_tokens=4096,
    )
    llm = provider.instantiate()

With custom sampling:

    provider = AnthropicProvider(
        model="claude-3-sonnet-20240229",
        temperature=0.8,
        top_p=0.9,
        top_k=40,
    )
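With rate limiting and caching (a sketch; these fields come from BaseLLMProvider per the parameter list above, and the throttle and TTL semantics shown in the comments are assumptions, not documented behavior):

    provider = AnthropicProvider(
        model="claude-3-haiku-20240307",
        requests_per_second=2.0,  # assumed: client-side request throttle
        max_retries=3,
        retry_delay=1.0,          # assumed: seconds between retry attempts
        cache_enabled=True,
        cache_ttl=3600,           # assumed: cache lifetime in seconds
    )
    llm = provider.instantiate()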
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
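Because AnthropicProvider is a Pydantic model, invalid configuration fails fast at construction time. A minimal sketch, assuming the temperature field enforces the 0-1 range noted above:

    from pydantic import ValidationError

    try:
        AnthropicProvider(model="claude-3-opus-20240229", temperature=2.5)
    except ValidationError as exc:
        print(exc)  # reports the out-of-range temperature value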