The official Python SDK for the Hanzo AI platform, providing unified access to 100+ LLM providers through a single OpenAI-compatible API.
## Features
- Unified API: Single interface for 100+ LLM providers (OpenAI, Anthropic, Google, Meta, etc.)
- OpenAI Compatible: Drop-in replacement for OpenAI SDK
- Enterprise Features: Cost tracking, rate limiting, observability
- Local AI Support: Run models locally with node infrastructure
- Model Context Protocol (MCP): Advanced tool use and context management
- Agent Framework: Build and orchestrate AI agents
- Memory Management: Persistent memory and RAG capabilities
- Network Orchestration: Distributed AI compute capabilities
## Installation

### Basic Installation

```bash
pip install hanzoai
```

### Full Installation (All Features)

```bash
pip install "hanzoai[all]"
```

### Development Installation
```bash
git clone https://github.com/hanzoai/python-sdk.git
cd python-sdk
make setup
```

## Quick Start

### Basic Usage
```python
from hanzoai import Hanzo

# Initialize client
client = Hanzo(api_key="your-api-key")

# Chat completion
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
```
### Using Different Providers
```python
# Use Claude
response = client.chat.completions.create(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Hello!"}]
)

# Use local models
response = client.chat.completions.create(
    model="llama2:7b",
    messages=[{"role": "user", "content": "Hello!"}]
)
```
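Long responses can also be consumed as a stream. The helper below is a minimal sketch that assumes the OpenAI-compatible streaming shape (each chunk exposes `choices[0].delta.content`, which may be empty on the final chunk); `iter_text` is an illustrative name, not part of the SDK:

```python
from types import SimpleNamespace
from typing import Iterable, Iterator


def iter_text(chunks: Iterable) -> Iterator[str]:
    """Yield text fragments from an OpenAI-style streaming response."""
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:  # final chunk's delta is typically empty
            yield delta.content


# With a real client this would look like:
#   stream = client.chat.completions.create(
#       model="gpt-4",
#       messages=[{"role": "user", "content": "Hello!"}],
#       stream=True,
#   )
#   for piece in iter_text(stream):
#       print(piece, end="", flush=True)

# Stand-in chunks so the sketch is runnable without a network call:
fake_stream = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Hel", "lo", None]
]
text = "".join(iter_text(fake_stream))
```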
## Architecture

### Package Structure
```
python-sdk/
├── pkg/
│   ├── hanzo/           # CLI and orchestration tools
│   ├── hanzo-mcp/       # Model Context Protocol implementation
│   ├── hanzo-agents/    # Agent framework
│   ├── hanzo-network/   # Distributed network capabilities
│   ├── hanzo-memory/    # Memory and RAG
│   ├── hanzo-aci/       # AI code intelligence
│   ├── hanzo-repl/      # Interactive REPL
│   └── hanzoai/         # Core SDK
```
### Core Components

#### 1. Hanzo CLI (`hanzo`)

Command-line interface for AI operations:

```bash
# Chat with AI
hanzo chat

# Start local node
hanzo node start

# Manage router
hanzo router start

# Interactive REPL
hanzo repl
```
#### 2. Model Context Protocol (`hanzo-mcp`)

Advanced tool use and context management:

```python
from hanzo_mcp import create_mcp_server

server = create_mcp_server()
server.register_tool(my_tool)
server.start()
```
#### 3. Agent Framework (`hanzo-agents`)

Build and orchestrate AI agents:

```python
from hanzo_agents import Agent, Swarm

agent = Agent(
    name="researcher",
    model="gpt-4",
    instructions="You are a research assistant"
)
swarm = Swarm([agent])

# Inside an async function:
result = await swarm.run("Research quantum computing")
```
#### 4. Network Orchestration (`hanzo-network`)

Distributed AI compute:

```python
from hanzo_network import LocalComputeNode, DistributedNetwork

node = LocalComputeNode(node_id="node-001")
network = DistributedNetwork()
network.register_node(node)
```
#### 5. Memory Management (`hanzo-memory`)

Persistent memory and RAG:

```python
from hanzo_memory import MemoryService

memory = MemoryService()

# Inside an async function:
await memory.store("key", "value")
result = await memory.retrieve("key")
```
## Development

### Setup Development Environment

```bash
# Install Python 3.10+
make install-python

# Setup virtual environment
make setup

# Install development dependencies
make dev
```
### Running Tests

```bash
# Run all tests
make test

# Run specific package tests
make test-hanzo
make test-mcp
make test-agents

# Run with coverage
make test-coverage
```
### Code Quality

```bash
# Format code
make format

# Run linting
make lint

# Type checking
make type-check
```
### Building Packages

```bash
# Build all packages
make build

# Build specific package
cd pkg/hanzo && uv build
```
## Documentation

### Package Documentation
- Hanzo CLI Documentation
- MCP Documentation
- Agents Documentation
- Network Documentation
- Memory Documentation
### API Reference

See the API documentation for a detailed reference.
## Configuration

### Environment Variables

```bash
# API Configuration
HANZO_API_KEY=your-api-key
HANZO_BASE_URL=https://api.hanzo.ai

# Router Configuration
HANZO_ROUTER_URL=http://localhost:4000/v1

# Node Configuration
HANZO_NODE_URL=http://localhost:8000/v1

# Logging
HANZO_LOG_LEVEL=INFO
```
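These variables can be resolved with ordinary `os.environ` lookups. The helper below is an illustrative sketch (the SDK may read them automatically); the fallback values mirror the URLs documented above, and `load_settings` is a hypothetical name:

```python
import os


def load_settings() -> dict:
    """Collect Hanzo settings from the environment, with documented fallbacks."""
    return {
        "api_key": os.environ.get("HANZO_API_KEY"),
        "base_url": os.environ.get("HANZO_BASE_URL", "https://api.hanzo.ai"),
        "router_url": os.environ.get("HANZO_ROUTER_URL", "http://localhost:4000/v1"),
        "node_url": os.environ.get("HANZO_NODE_URL", "http://localhost:8000/v1"),
        "log_level": os.environ.get("HANZO_LOG_LEVEL", "INFO"),
    }
```

A settings dict like this can then be passed to the client constructor or logged at startup for debugging.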
### Configuration File

Create `~/.hanzo/config.yaml`:

```yaml
api:
  key: your-api-key
  base_url: https://api.hanzo.ai

router:
  url: http://localhost:4000/v1

node:
  url: http://localhost:8000/v1
  workers: 4

logging:
  level: INFO
```
## Deployment

### Docker

```bash
# Build image
docker build -t hanzo-sdk .

# Run container
docker run -p 8000:8000 hanzo-sdk
```
### Docker Compose

```bash
# Start all services
docker-compose up

# Start specific service
docker-compose up router
```
## Contributing
We welcome contributions! Please see our Contributing Guide for details.
### Development Workflow

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make changes and test
4. Commit changes (`git commit -m 'Add amazing feature'`)
5. Push to the branch (`git push origin feature/amazing-feature`)
6. Open a Pull Request
### Code Standards

- Follow PEP 8
- Use type hints
- Write tests for new features
- Update documentation
- Run `make lint` before committing
## Performance

### Benchmarks
| Operation | Latency | Throughput |
|---|---|---|
| Chat Completion | 50ms | 20 req/s |
| Embedding | 10ms | 100 req/s |
| Local Inference | 200ms | 5 req/s |
### Optimization Tips
- Use streaming for long responses
- Enable caching for repeated queries
- Use batch operations when possible
- Configure appropriate timeouts
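As a sketch of the caching tip, repeated identical queries can be memoized in process memory before ever reaching the API. This is an illustrative assumption rather than a built-in SDK feature; `cached_complete` is a hypothetical stand-in for a real client call:

```python
import functools

# Counter to demonstrate that repeated queries skip the (stand-in) API call.
CALLS = {"count": 0}


@functools.lru_cache(maxsize=256)
def cached_complete(model: str, prompt: str) -> str:
    """Stand-in for client.chat.completions.create(...); cached per (model, prompt)."""
    CALLS["count"] += 1
    return f"response from {model} to: {prompt}"


# Identical (model, prompt) pairs hit the cache instead of the API.
first = cached_complete("gpt-4", "Hello!")
second = cached_complete("gpt-4", "Hello!")
```

Note that `lru_cache` requires hashable arguments, so a full `messages` list would need to be serialized (e.g. to a JSON string or tuple) before being used as a cache key.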
## Security
- API keys are encrypted at rest
- All communications use TLS 1.3+
- Regular security audits
- SOC 2 Type II certified
Report security issues to security@hanzo.ai.
## License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
## Acknowledgments
- OpenAI for the API specification
- Anthropic for Claude integration
- The open-source community
## Support
- Documentation: https://docs.hanzo.ai
- Discord: https://discord.gg/hanzo
- Email: support@hanzo.ai
- GitHub Issues: https://github.com/hanzoai/python-sdk/issues
## Roadmap
- Multi-modal support (images, audio, video)
- Enhanced caching strategies
- WebSocket streaming
- Browser SDK
- Mobile SDKs (iOS, Android)
Built with ❤️ by the Hanzo team