Getting Started
This guide will help you set up and run your first HAP (Haive Agent Protocol) workflow.
Prerequisites
Before starting, ensure you have:
Python 3.12 or higher
Poetry for dependency management
Access to an LLM provider (OpenAI, Anthropic, etc.)
Installation
Install the haive-hap package:
cd packages/haive-hap
poetry install
Set up your environment variables:
# Example for OpenAI
export OPENAI_API_KEY="your-api-key-here"
# Or create a .env file
echo "OPENAI_API_KEY=your-api-key-here" > .env
First HAP Workflow
Let's create a simple workflow with a single agent:
"""Simple HAP workflow example."""
import asyncio
from haive.hap.models import HAPGraph
from haive.hap.server.runtime import HAPRuntime
from haive.agents.simple.agent import SimpleAgent
from haive.core.engine.aug_llm import AugLLMConfig
async def main():
# Create an agent
agent = SimpleAgent(
name="helpful_assistant",
engine=AugLLMConfig(
temperature=0.7,
system_message="You are a helpful assistant that provides clear, concise answers."
)
)
# Create a graph
graph = HAPGraph()
graph.add_agent_node("assistant", agent)
graph.entry_node = "assistant"
# Create runtime and execute
runtime = HAPRuntime(graph)
result = await runtime.run({
"messages": [{
"role": "user",
"content": "What is the Haive Agent Protocol?"
}]
})
print("๐ HAP Workflow Complete!")
print(f"๐ Execution Path: {result.execution_path}")
print(f"๐ฌ Response: {result.outputs}")
if __name__ == "__main__":
asyncio.run(main())
Multi-Agent Workflow
Now let's create a more complex workflow with multiple agents:
"""Multi-agent HAP workflow example."""
import asyncio
from haive.hap.models import HAPGraph
from haive.hap.server.runtime import HAPRuntime
from haive.agents.simple.agent import SimpleAgent
from haive.core.engine.aug_llm import AugLLMConfig
async def multi_agent_workflow():
# Create specialized agents
analyzer = SimpleAgent(
name="data_analyzer",
engine=AugLLMConfig(
temperature=0.3,
system_message="You are a data analyst. Analyze the given data and provide insights."
)
)
summarizer = SimpleAgent(
name="summarizer",
engine=AugLLMConfig(
temperature=0.5,
system_message="You create clear, concise summaries of analysis results."
)
)
# Build the workflow graph
graph = HAPGraph()
graph.add_agent_node("analyze", analyzer, next_nodes=["summarize"])
graph.add_agent_node("summarize", summarizer)
graph.entry_node = "analyze"
# Execute the workflow
runtime = HAPRuntime(graph)
result = await runtime.run({
"data": "Sales data: Q1: $100k, Q2: $120k, Q3: $140k, Q4: $160k",
"task": "Analyze this sales data and provide a summary"
})
print("๐ Multi-Agent Workflow Complete!")
print(f"๐ Path: {' โ '.join(result.execution_path)}")
print(f"๐ Final Summary: {result.outputs}")
if __name__ == "__main__":
asyncio.run(multi_agent_workflow())
Using Agent Entrypoints
You can also define agents using entrypoint strings for more flexibility:
"""Agent entrypoint example."""
from haive.hap.models import HAPGraph
from haive.hap.server.runtime import HAPRuntime
# Create graph using entrypoints
graph = HAPGraph()
graph.add_entrypoint_node(
"processor",
"haive.agents.simple:SimpleAgent",
next_nodes=["validator"]
)
graph.add_entrypoint_node(
"validator",
"haive.agents.simple:SimpleAgent"
)
graph.entry_node = "processor"
# Runtime will load agents dynamically
runtime = HAPRuntime(graph)
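The "package.module:ClassName" format follows the usual Python entrypoint convention. As an illustration of what dynamic loading involves (this is a sketch of the general technique, not necessarily HAPRuntime's own loader, and resolve_entrypoint is a hypothetical helper):
# Illustrative only: resolve an entrypoint string with importlib.
import importlib


def resolve_entrypoint(entrypoint: str):
    """Split 'package.module:ClassName' and return the named attribute."""
    module_path, _, attr_name = entrypoint.partition(":")
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)


agent_cls = resolve_entrypoint("haive.agents.simple:SimpleAgent")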
Working with Context
HAP maintains execution context throughout the workflow:
"""Context tracking example."""
import asyncio
from haive.hap.models import HAPGraph, HAPContext
from haive.hap.server.runtime import HAPRuntime
async def context_example():
# Create initial context
context = HAPContext()
context.execution_path = []
context.agent_metadata = {}
# Build graph
graph = HAPGraph()
# ... add nodes ...
# Execute with context
runtime = HAPRuntime(graph)
result = await runtime.run({"input": "data"}, context=context)
# Access execution metadata
for node_id in result.execution_path:
metadata = result.agent_metadata.get(node_id, {})
print(f"Node {node_id}: {metadata}")
Error Handling
HAP provides comprehensive error handling:
"""Error handling example."""
import asyncio
from haive.hap.server.runtime import HAPRuntime
async def safe_execution():
try:
runtime = HAPRuntime(graph)
result = await runtime.run(input_data)
except ImportError as e:
print(f"โ Agent import failed: {e}")
except RuntimeError as e:
print(f"โ Execution failed: {e}")
except Exception as e:
print(f"โ Unexpected error: {e}")
else:
print("โ
Execution successful!")
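For transient failures (for example, a flaky LLM provider), you can layer a simple retry on top of the same exception handling. This is a sketch of a common pattern, not a built-in HAP feature, and it assumes transient problems surface as the RuntimeError shown above:
# Sketch: retry transient RuntimeErrors with a short delay between attempts.
import asyncio


async def run_with_retries(runtime, input_data, attempts=3, delay=1.0):
    for attempt in range(1, attempts + 1):
        try:
            return await runtime.run(input_data)
        except RuntimeError as e:
            if attempt == attempts:
                raise  # out of retries; let the caller handle it
            print(f"Attempt {attempt} failed ({e}); retrying in {delay}s...")
            await asyncio.sleep(delay)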
Next Steps
Now that you have HAP running:
Explore Examples: Check out Examples for more complex workflows
Learn the Models: Read HAP Models to understand HAP data structures
Runtime Details: Study HAP Server for advanced runtime features
Protocol Integration: See HAP Protocol for JSON-RPC protocol usage
Common Patterns
Sequential Processing:
# Data flows: Agent A -> Agent B -> Agent C
graph.add_agent_node("step1", agent1, next_nodes=["step2"])
graph.add_agent_node("step2", agent2, next_nodes=["step3"])
graph.add_agent_node("step3", agent3)
Parallel Processing:
# Data flows: Agent A -> [Agent B, Agent C] -> Agent D
graph.add_agent_node("splitter", splitter, next_nodes=["worker1", "worker2"])
graph.add_agent_node("worker1", worker1, next_nodes=["joiner"])
graph.add_agent_node("worker2", worker2, next_nodes=["joiner"])
graph.add_agent_node("joiner", joiner)
Conditional Routing:
# Agent A decides which path to take
graph.add_agent_node("router", router, ["path_a", "path_b"])
graph.add_agent_node("path_a", specialist_a)
graph.add_agent_node("path_b", specialist_b)
Troubleshooting
Common Issues:
Import Errors: Ensure all required packages are installed with poetry install
Agent Loading: Check that agent entrypoints are correctly formatted
API Keys: Verify environment variables are set correctly (see the preflight check after this list)
Context Issues: Make sure context flows properly between nodes
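A small preflight check for the API-key issue, using only the standard library (a sketch; swap in the variable name for your provider):
# Preflight sketch: confirm the provider API key is present before running a workflow.
import os

if not os.getenv("OPENAI_API_KEY"):
    raise SystemExit("OPENAI_API_KEY is not set; export it or add it to your .env file")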
Debug Mode:
# Enable debug logging
import logging
logging.basicConfig(level=logging.DEBUG)
# Execute with debug info
result = await runtime.run(data, debug=True)
Testing:
# Run HAP tests
poetry run pytest tests/ -v
# Test specific functionality
poetry run pytest tests/test_hap_runtime.py -v
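If you want to add checks of your own, a minimal test against the graph-building API shown above might look like this (a sketch; the file name and assertion are illustrative):
# tests/test_my_workflow.py (illustrative)
from haive.hap.models import HAPGraph


def test_graph_wires_processor_to_validator():
    # Build the same entrypoint-based graph as in the example above.
    graph = HAPGraph()
    graph.add_entrypoint_node(
        "processor",
        "haive.agents.simple:SimpleAgent",
        next_nodes=["validator"],
    )
    graph.add_entrypoint_node("validator", "haive.agents.simple:SimpleAgent")
    graph.entry_node = "processor"

    assert graph.entry_node == "processor"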
Ready to build more complex workflows? Check out the Examples and Tutorials sections!