# agent-io

A Rust SDK for building AI agents with multi-provider LLM support.
## Features
- Multi-provider LLM support: OpenAI, Anthropic, Google Gemini, and OpenAI-compatible providers
- Tool/function calling: Built-in tool system; define tools with a simple `#[tool]` macro
- Streaming responses: Event-based real-time response handling
- Context compaction: Automatic management of long conversation context
- Token tracking: Usage tracking and cost calculation across providers
- Retry mechanism: Built-in exponential backoff retry for rate limit handling
- Memory system: In-memory memory by default, with optional LanceDB persistence via feature flag
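The retry behavior listed above can be sketched as follows. This is an illustrative, std-only sketch of an exponential backoff schedule; the function name, base delay, and cap are assumptions for the example, not the crate's actual implementation:

```rust
/// Illustrative exponential backoff: delay doubles each attempt (base_ms * 2^attempt),
/// capped at max_ms. A real retry loop would sleep for each delay and add jitter.
fn backoff_delays_ms(attempts: u32, base_ms: u64, max_ms: u64) -> Vec<u64> {
    (0..attempts)
        .map(|attempt| base_ms.saturating_mul(1u64 << attempt).min(max_ms))
        .collect()
}

fn main() {
    // Five retries starting at 500 ms, capped at 4 s.
    println!("{:?}", backoff_delays_ms(5, 500, 4000)); // [500, 1000, 2000, 4000, 4000]
}
```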
## Installation
```toml
[dependencies]
agent-io = { version = "0.3", features = ["openai"] }
tokio = { version = "1", features = ["full"] }
```
Enable the `memory-lancedb` feature only if you need persistent vector-backed memory:
```toml
agent-io = { version = "0.3", features = ["openai", "memory-lancedb"] }
```
## Quick Start

### Basic Usage
```rust
use std::sync::Arc;

use agent_io::{Agent, llm::ChatOpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let llm = ChatOpenAI::new("gpt-4o")?;
    let agent = Agent::builder()
        .with_llm(Arc::new(llm))
        .build()?;

    let response = agent.query("Hello!").await?;
    println!("{}", response);
    Ok(())
}
```
### Defining Tools with `#[tool]`
The `#[tool]` macro eliminates boilerplate; just write a plain async `fn`:
```rust
use std::sync::Arc;

use agent_io::{Agent, llm::ChatOpenAI, tool};

/// Get the current weather for a city
#[tool(location = "The city name to fetch weather for")]
async fn get_weather(location: String) -> agent_io::Result<String> {
    Ok(format!("Weather in {location}: Sunny, 25°C"))
}

/// Evaluate a simple arithmetic expression
#[tool(expression = "The expression to evaluate, e.g. '15 * 7'")]
async fn calculator(expression: String) -> agent_io::Result<String> {
    // ... implementation
    Ok("Result: 105".to_string())
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let llm = ChatOpenAI::new("gpt-5.4")?;
    let agent = Agent::builder()
        .with_llm(Arc::new(llm))
        .tool(get_weather()) // macro generates an Arc<dyn Tool> constructor
        .tool(calculator())
        .system_prompt("You are a helpful assistant.")
        .build()?;

    let response = agent.query("What's the weather in Tokyo and 15 * 7?").await?;
    println!("{}", response);
    Ok(())
}
```
The macro automatically:

- Uses the function's doc comment as the tool description
- Maps Rust types to JSON Schema types (`String` → `"string"`, `f64` → `"number"`, etc.)
- Uses the attribute `key = "value"` pairs as parameter descriptions
- Generates a zero-arg constructor returning `Arc<dyn Tool>`
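Based on the mapping rules above, the parameter schema generated for `get_weather` would plausibly look like this (a hand-written sketch in the common function-calling schema shape; the exact structure the macro emits may differ):

```json
{
  "name": "get_weather",
  "description": "Get the current weather for a city",
  "parameters": {
    "type": "object",
    "properties": {
      "location": {
        "type": "string",
        "description": "The city name to fetch weather for"
      }
    },
    "required": ["location"]
  }
}
```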
### Manual Tool Definition
For more control, use `FunctionTool` or `ToolBuilder` directly:
```rust
use std::sync::Arc;

use agent_io::tools::{Tool, ToolBuilder};
use serde::Deserialize;

#[derive(Deserialize)]
struct WeatherArgs {
    location: String,
}

let tool: Arc<dyn Tool> = ToolBuilder::new("get_weather")
    .description("Get weather for a location")
    .string_param("location", "The city name")
    .build(|args: WeatherArgs| {
        Box::pin(async move { Ok(format!("Sunny in {}", args.location)) })
    });
```
## Supported Providers
| Provider | Type | Environment Variable |
|---|---|---|
| OpenAI | `ChatOpenAI` | `OPENAI_API_KEY` |
| Anthropic | `ChatAnthropic` | `ANTHROPIC_API_KEY` |
| Google Gemini | `ChatGoogle` | `GOOGLE_API_KEY` |
| DeepSeek | `ChatDeepSeek` | `DEEPSEEK_API_KEY` |
| Groq | `ChatGroq` | `GROQ_API_KEY` |
| Mistral | `ChatMistral` | `MISTRAL_API_KEY` |
| Ollama | `ChatOllama` | — (local) |
| OpenRouter | `ChatOpenRouter` | `OPENROUTER_API_KEY` |
| OpenAI-compatible | `ChatOpenAICompatible` | configurable |
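The environment-variable column above amounts to a simple lookup per provider. The helper below is hypothetical (not part of the crate) and only encodes the table for illustration:

```rust
/// Hypothetical helper: map a provider name to the env var it reads,
/// per the table above. Returns None for providers that need no key.
fn api_key_env(provider: &str) -> Option<&'static str> {
    match provider {
        "openai" => Some("OPENAI_API_KEY"),
        "anthropic" => Some("ANTHROPIC_API_KEY"),
        "google" => Some("GOOGLE_API_KEY"),
        "deepseek" => Some("DEEPSEEK_API_KEY"),
        "groq" => Some("GROQ_API_KEY"),
        "mistral" => Some("MISTRAL_API_KEY"),
        "openrouter" => Some("OPENROUTER_API_KEY"),
        "ollama" => None, // local, no key required
        _ => None,
    }
}

fn main() {
    assert_eq!(api_key_env("anthropic"), Some("ANTHROPIC_API_KEY"));
    assert_eq!(api_key_env("ollama"), None);
}
```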
## Feature Flags
```toml
[dependencies.agent-io]
version = "0.3"
features = ["openai", "anthropic", "google"]

# Optional persistent memory backend:
# features = ["openai", "memory-lancedb"]

# Or enable the bundled provider set:
# features = ["full"]
```
## Examples
```sh
# Basic example (manual tool definition)
cargo run --example basic

# Macro-based tools (zero boilerplate)
cargo run --example macro_tools

# Multi-provider
cargo run --example multi_provider --features full
```
## Workspace
This repository is a Cargo workspace:
```text
agent-io/         # main SDK crate
agent-io-macros/  # proc-macro crate (#[tool])
```
## License
Licensed under the Apache License 2.0.