haive.core.models.llm.providers.bedrock

AWS Bedrock Provider Module.

This module implements the AWS Bedrock language model provider for the Haive framework, supporting Amazon’s managed LLM service with models from Anthropic, AI21, Cohere, and others.

The provider handles AWS credentials, region configuration, and safe imports of the langchain-aws package dependencies.

Examples

Basic usage:

from haive.core.models.llm.providers.bedrock import BedrockProvider

provider = BedrockProvider(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
    temperature=0.7,
    max_tokens=1000
)
llm = provider.instantiate()
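
The object returned by instantiate() is expected to behave like a LangChain chat model (the concrete class, presumably supplied by langchain-aws, is not documented here), so a typical call would follow the standard Runnable interface:

response = llm.invoke("Summarize the AWS shared responsibility model in one sentence.")
print(response.content)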

With custom AWS configuration:

provider = BedrockProvider(
    model="ai21.j2-ultra-v1",
    region_name="us-west-2",
    aws_access_key_id="...",
    aws_secret_access_key="...",
    temperature=0.1
)
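
If the explicit key fields are omitted, the expectation (an assumption, not stated in this module) is that the underlying AWS client falls back to the standard credential chain of environment variables, ~/.aws/credentials, or an attached IAM role; the model ID below is illustrative:

# Credentials resolved from the environment or AWS config, not passed explicitly.
provider = BedrockProvider(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    region_name="us-east-1",
    temperature=0.3
)
llm = provider.instantiate()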


Classes

BedrockProvider

AWS Bedrock language model provider configuration.

Module Contents

class haive.core.models.llm.providers.bedrock.BedrockProvider(/, **data)[source]

Bases: haive.core.models.llm.providers.base.BaseLLMProvider

AWS Bedrock language model provider configuration.

This provider supports AWS Bedrock’s managed LLM service including models from Anthropic Claude, AI21 Jurassic, Cohere Command, and Amazon Titan.

Parameters:
  • data (Any)

  • requests_per_second (float | None)

  • tokens_per_second (int | None)

  • tokens_per_minute (int | None)

  • max_retries (int)

  • retry_delay (float)

  • check_every_n_seconds (float | None)

  • burst_size (int | None)

  • provider (LLMProvider)

  • model (str | None)

  • name (str | None)

  • api_key (SecretStr)

  • cache_enabled (bool)

  • cache_ttl (int | None)

  • extra_params (dict[str, Any] | None)

  • debug (bool)

  • region_name (str)

  • aws_access_key_id (str | None)

  • aws_secret_access_key (str | None)

  • aws_session_token (str | None)

  • temperature (float | None)

  • max_tokens (int | None)

  • top_p (float | None)

provider
    Always LLMProvider.BEDROCK
    Type: LLMProvider

model
    The Bedrock model ID to use
    Type: str | None

region_name
    AWS region for the Bedrock service
    Type: str

aws_access_key_id
    AWS access key ID
    Type: str | None

aws_secret_access_key
    AWS secret access key
    Type: str | None

aws_session_token
    AWS session token (for temporary credentials)
    Type: str | None

temperature
    Sampling temperature (0.0-1.0)
    Type: float | None

max_tokens
    Maximum tokens in the response
    Type: int | None

top_p
    Nucleus sampling parameter
    Type: float | None

Examples

Claude 3 on Bedrock:

provider = BedrockProvider(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
    temperature=0.7
)

AI21 Jurassic model:

provider = BedrockProvider(
    model="ai21.j2-ultra-v1",
    region_name="us-west-2",
    temperature=0.1,
    max_tokens=2000
)
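
The aws_session_token field suggests that temporary (STS) credentials are supported; a sketch under that assumption, using an illustrative Amazon Titan model ID:

provider = BedrockProvider(
    model="amazon.titan-text-express-v1",
    region_name="us-east-1",
    aws_access_key_id="...",
    aws_secret_access_key="...",
    aws_session_token="...",
    temperature=0.5
)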

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

classmethod get_models()[source]

Get available Bedrock models.

Return type: list[str]
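
A minimal usage sketch; the exact IDs returned are not documented here, and the "anthropic." prefix filter below is only an assumption about Bedrock's naming convention:

models = BedrockProvider.get_models()
claude_models = [m for m in models if m.startswith("anthropic.")]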

classmethod validate_model_id(v)[source]

Validate Bedrock model ID format.

Parameters: v (str)

Return type: str
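
The validation rules are not spelled out here; presumably the validator accepts vendor-prefixed IDs (such as "anthropic.claude-3-sonnet-20240229-v1:0") and returns the value unchanged, raising a validation error otherwise:

model_id = BedrockProvider.validate_model_id("anthropic.claude-3-sonnet-20240229-v1:0")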

max_tokens: int | None = None

Maximum number of tokens to generate in the response.