agents.memory_v2.token_tracker¶
Token tracking component for memory operations.
Monitors token usage across memory operations and triggers summarization or rewriting when approaching context limits.
Classes¶
- TokenThresholds – Token usage thresholds for different alert levels.
- TokenTracker – Track token usage across memory operations with intelligent monitoring.
- TokenUsageEntry – Single token usage entry for tracking.
Module Contents¶
- class agents.memory_v2.token_tracker.TokenThresholds(/, **data)¶
Bases:
pydantic.BaseModel
Token usage thresholds for different alert levels.
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
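The threshold semantics can be sketched as follows. This is an illustrative model only: the real TokenThresholds is a pydantic BaseModel whose field names are not shown here, so `context_limit`, `warning_ratio`, and `critical_ratio` are assumed names, and a dataclass stands in to keep the sketch dependency-free.

```python
from dataclasses import dataclass

@dataclass
class TokenThresholdsSketch:
    """Hypothetical threshold levels; field names are assumptions."""
    context_limit: int = 128_000   # assumed total context window
    warning_ratio: float = 0.7     # alert once 70% of the limit is used
    critical_ratio: float = 0.9    # trigger summarization near the limit

    def level(self, used_tokens: int) -> str:
        """Classify current usage against the thresholds."""
        ratio = used_tokens / self.context_limit
        if ratio >= self.critical_ratio:
            return "critical"
        if ratio >= self.warning_ratio:
            return "warning"
        return "ok"
```

A tracker could consult `level()` after every operation and escalate from alerting to summarization as usage crosses each band.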
- class agents.memory_v2.token_tracker.TokenTracker(/, **data)¶
Bases:
pydantic.BaseModel
Track token usage across memory operations with intelligent monitoring.
Features:
- Real-time token tracking by operation
- Threshold monitoring with alerts
- Usage pattern analysis
- Recommendations for optimization
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- can_fit_operation(estimated_tokens)¶
Check if an operation can fit within remaining tokens.
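A minimal sketch of this check, assuming the tracker holds a running total and a context limit, and reserves a small safety margin because token estimates are approximate. Everything except the `estimated_tokens` parameter name is an assumption.

```python
def can_fit_operation(used_tokens: int, context_limit: int,
                      estimated_tokens: int,
                      safety_margin: float = 0.05) -> bool:
    """Return True if the estimated operation fits in the remaining budget.

    The safety margin is an assumed behavior: since estimates are rough,
    a fraction of the window is held back rather than filled to the brim.
    """
    budget = int(context_limit * (1 - safety_margin))
    return used_tokens + estimated_tokens <= budget
```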
- estimate_tokens_for_content(content)¶
Estimate tokens for given content.
Uses a simple heuristic of roughly 4 characters per token; a production implementation would use a proper tokenizer.
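The stated heuristic (about 4 characters per token) reduces to a one-line estimate; the floor of 1 for non-empty input is an assumption, not documented behavior.

```python
def estimate_tokens_for_content(content: str) -> int:
    """Estimate token count at ~4 characters per token (rough heuristic).

    A real deployment would use the model's own tokenizer instead;
    returning at least 1 for non-empty content is an assumed detail.
    """
    if not content:
        return 0
    return max(1, len(content) // 4)
```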
- get_recommendations()¶
Get recommendations based on usage patterns.
- get_usage_summary()¶
Get comprehensive usage summary.
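The summary's exact schema is not documented; one plausible shape, aggregating per-operation usage entries into totals, might look like this. The `(operation, tokens)` entry format and every key name here are assumptions.

```python
from collections import Counter

def get_usage_summary(entries: list[tuple[str, int]]) -> dict:
    """Aggregate (operation, tokens) entries into a summary dict.

    The returned keys are illustrative; the real method's schema may differ.
    """
    by_op: Counter[str] = Counter()
    for op, tokens in entries:
        by_op[op] += tokens
    return {
        "total_tokens": sum(by_op.values()),
        "by_operation": dict(by_op),
        "top_operation": by_op.most_common(1)[0][0] if by_op else None,
    }
```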
- reset_tokens(keep_history=True)¶
Reset token counts while optionally keeping history.
- Parameters:
keep_history (bool) – Whether to keep usage history
- Return type:
None
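The reset behavior can be sketched against an assumed internal state of a running total plus a history list; the `state` layout shown is hypothetical.

```python
def reset_tokens(state: dict, keep_history: bool = True) -> None:
    """Zero the running token count, optionally clearing usage history.

    The {"used_tokens": int, "history": list} layout is an assumption
    standing in for the tracker's actual fields.
    """
    state["used_tokens"] = 0
    if not keep_history:
        state["history"].clear()
```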
- suggest_compression_targets(target_reduction=0.3)¶
Suggest operations to target for compression.
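One natural reading of this method, given the `target_reduction=0.3` default, is a greedy selection of the heaviest token consumers until the requested fraction of total usage would be freed. The greedy strategy and the per-operation dict input are assumptions, not the documented algorithm.

```python
def suggest_compression_targets(usage_by_operation: dict[str, int],
                                target_reduction: float = 0.3) -> list[str]:
    """Pick operations to compress, largest consumers first, until the
    requested fraction of total tokens would be recovered.

    Greedy largest-first selection is an assumed strategy.
    """
    total = sum(usage_by_operation.values())
    goal = total * target_reduction
    targets: list[str] = []
    freed = 0
    for op, tokens in sorted(usage_by_operation.items(),
                             key=lambda kv: kv[1], reverse=True):
        if freed >= goal:
            break
        targets.append(op)
        freed += tokens
    return targets
```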
- class agents.memory_v2.token_tracker.TokenUsageEntry(/, **data)¶
Bases:
pydantic.BaseModel
Single token usage entry for tracking.
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)