ORBIT: Enterprise AI Gateway
Open Retrieval-Based Inference Toolkit
Connect 20+ LLM providers and enterprise data through one governed API.
ORBIT is a self-hosted gateway that eliminates vendor lock-in and integration glue code. It unifies LLMs, databases, APIs, and voice engines behind one OpenAI-compatible interface and MCP endpoint, letting teams standardize on one integration surface while keeping security, compliance, and operational controls consistent.
Try the Sandbox | API Reference | Docker Guide
Star ORBIT on GitHub to follow new adapters, releases, and production features.
Officially backed by Schmitech, the ORBIT service provider for enterprise deployment and support.
⚡ Get Value in 60 Seconds
A) Try the hosted API now
```bash
curl -X POST https://orbit.schmitech.ai/v1/chat \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: default-key' \
  -H 'X-Session-ID: test-session' \
  -d '{
    "messages": [{"role": "user", "content": "What is ORBIT?"}],
    "stream": false
  }'
```
B) Run ORBIT locally with Docker Compose (recommended — includes Ollama)
```bash
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d

# Wait for services to start, then test
curl -X POST http://localhost:3000/v1/chat \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: default-key' \
  -H 'X-Session-ID: local-test' \
  -d '{
    "messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
    "stream": false
  }'
```
For GPU acceleration (NVIDIA):

```bash
docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d
```
C) Run ORBIT from the pre-built image (server only; point it at your own Ollama)
```bash
docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 3000:3000 schmitech/orbit:basic
```
If Ollama runs on your host (default port 11434), add `-e OLLAMA_HOST=host.docker.internal:11434` to the `docker run` command so the container can reach it. The image includes only the `simple-chat` adapter; for the full stack (Ollama + models), use option B or the Docker Guide.
🚀 Key Capabilities
- Unified API: Switch between OpenAI, Anthropic, Gemini, Groq, or local models (Ollama/vLLM) with a config change.
- Agentic AI & MCP: Compatible with Model Context Protocol (MCP) for tool-enabled agent workflows.
- Native RAG: Connect Postgres, MongoDB, Elasticsearch, or Pinecone for natural-language data access.
- Voice-First: Real-time, full-duplex speech-to-speech with interruption handling via PersonaPlex.
- Governance Built In: RBAC, rate limiting, audit logging, and circuit breakers.
- Privacy First: Self-host on your own infrastructure for full data control.
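Rate limiting and circuit breaking above are standard gateway controls. Purely as an illustration of the pattern (this is not ORBIT's actual implementation), a token-bucket limiter works like this:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: refills at `rate` tokens/sec up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=2)   # burst of 2, then 5 requests/sec
print([bucket.allow() for _ in range(3)])  # third call exhausts the burst
```

A gateway applies a bucket like this per API key, which is why per-client limits survive provider switches: the policy lives in front of the models, not inside any one SDK.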
🆚 Why Enterprise Teams Choose ORBIT
| If you use... | You often get... | ORBIT gives you... |
|---|---|---|
| Single-provider SDKs | Vendor lock-in and provider-specific rewrites | One OpenAI-compatible API across providers |
| Basic LLM proxy only | Model routing, but no data connectivity | Unified model + retrieval + tooling gateway |
| RAG-only framework | Strong retrieval, weak multi-provider inference control | Native RAG with multi-provider and policy controls |
| In-house glue scripts | Fragile integrations and high ops cost | A production-ready gateway with RBAC, limits, and logs |
🏢 Enterprise Readiness
- Deployment Flexibility: Run ORBIT in your own environment for strict data-boundary requirements.
- Operational Control: Standardize access, traffic policies, and audit trails behind one gateway.
- Architecture Fit: Integrates with existing data systems, identity patterns, and model providers.
- Service Backing: Schmitech provides enterprise onboarding, deployment support, and ongoing operations guidance.
🎯 Common Use Cases
- Enterprise RAG: Query SQL, NoSQL, and vector stores with one natural-language API.
- Provider Failover: Route between Ollama, vLLM, OpenAI, Anthropic, Gemini, Groq, etc. without rewrites.
- Voice Agents: Build full-duplex speech-to-speech experiences with interruption handling.
- MCP Tooling Layer: Expose data and actions to agentic apps through MCP compatibility.
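Provider failover above happens inside the gateway, so clients never change. As an illustration of the underlying pattern only (provider names and callables here are placeholders, not ORBIT APIs), a fallback chain looks like this:

```python
def with_failover(providers, prompt):
    """Try each (name, callable) provider in order; return the first success."""
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # a real gateway would filter on retryable errors
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {errors}")

# Stub providers standing in for real backends (e.g. a local model and a hosted one).
def flaky(prompt):
    raise TimeoutError("backend unavailable")

def healthy(prompt):
    return f"echo: {prompt}"

name, answer = with_failover([("primary", flaky), ("fallback", healthy)], "hi")
print(name, answer)  # fallback echo: hi
```

Because ORBIT centralizes this logic, applications keep calling one endpoint while routing policy changes in configuration.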
🛠️ One Gateway, Many Clients
| Client | Link | Description |
|---|---|---|
| Web Chat | ORBIT Chat | React UI. |
| CLI | `pip install schmitech-orbit-client` | Chat directly from your terminal. |
| Mobile | ORBIT Mobile | iOS & Android app built with Expo. |
| SDKs | Node SDK | Or use any standard OpenAI-compatible SDK. |
📦 Deployment
Docker Compose (Fastest Path)
```bash
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d
```
This starts ORBIT + Ollama with SmolLM2, auto-pulls models, and exposes the API on port 3000. To connect orbitchat from your host:

```bash
ORBIT_ADAPTER_KEYS='{"simple-chat":"default-key"}' npx orbitchat
```
Pre-built image only (server + your own Ollama):

```bash
docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 3000:3000 \
  -e OLLAMA_HOST=host.docker.internal:11434 \
  schmitech/orbit:basic
```

The `-e OLLAMA_HOST=...` flag is needed when Ollama runs on the host.
See the full Docker Guide for GPU mode, volumes, single-container run, and configuration.
Stable Release (Recommended for Production)
```bash
curl -L https://github.com/schmitech/orbit/releases/download/v2.6.0/orbit-2.6.0.tar.gz -o orbit-2.6.0.tar.gz
tar -xzf orbit-2.6.0.tar.gz && cd orbit-2.6.0
cp env.example .env && ./install/setup.sh
source venv/bin/activate
./bin/orbit.sh start && cat ./logs/orbit.log
```
📈 Project Momentum
- Frequent releases: Releases
- Active roadmap and Q&A: Discussions
- Feature requests and bugs: Issues
- Technical writeups: Articles & Case Studies
- Enterprise services: Official ORBIT provider (Schmitech)
🧩 Supported Integrations
Inference: OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, Mistral, AWS Bedrock, Azure, Together, Ollama, vLLM, llama.cpp.
Data Adapters: PostgreSQL, MySQL, MongoDB, Elasticsearch, DuckDB, Chroma, Qdrant, Pinecone, Milvus, Weaviate.
📚 Resources & Support
- Step-by-Step Tutorial – Learn how to chat with your own data in minutes.
- Articles & Case Studies – Deep dives into configuration and real-world use cases.
- Documentation – Full architecture and setup guides.
- GitHub Issues – Bug reports and feature requests.
- Discussions – Community help and roadmap.
- Enterprise Services – Backed by Schmitech for onboarding, deployment, and production support.
- Good First Issues – Starter tasks for new contributors.
- Help Wanted – High-impact tasks where contributions are needed.
⭐ Help ORBIT grow: Star the repo to support the project and get notified of new adapters!
📄 License
Apache 2.0 – see LICENSE.