ORBIT: an adaptable, open-source, context-aware inference engine designed for privacy, control, and independence from proprietary models.


One API for 20+ LLM providers, your databases, and your files.

Self-hosted. Open-source. Production-ready.


Live Sandbox  |  API Reference  |  Docker Guide  |  Cookbook


Get running in 60 seconds

git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d

Then test it:

curl -X POST http://localhost:3000/v1/chat \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: default-key' \
  -H 'X-Session-ID: local-test' \
  -d '{
    "messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
    "stream": false
  }'

That's it. ORBIT is listening on port 3000 with an admin panel at localhost:3000/admin (default login: admin / admin123).

For GPU acceleration: docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d
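
The same smoke test can be run from Python with only the standard library. This sketch mirrors the curl call above: the endpoint, headers, and default-key API key come straight from the quick-start example, while the build_request/chat helper names are ours.

```python
import json
import urllib.request

# Endpoint and credentials from the quick-start above.
ORBIT_URL = "http://localhost:3000/v1/chat"

def build_request(message: str, api_key: str = "default-key",
                  session_id: str = "local-test", stream: bool = False):
    """Assemble the headers and JSON body that /v1/chat expects."""
    headers = {
        "Content-Type": "application/json",
        "X-API-Key": api_key,
        "X-Session-ID": session_id,
    }
    body = {"messages": [{"role": "user", "content": message}],
            "stream": stream}
    return headers, json.dumps(body).encode("utf-8")

def chat(message: str) -> str:
    """POST a single user message to a running ORBIT instance."""
    headers, data = build_request(message)
    req = urllib.request.Request(ORBIT_URL, data=data, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

With the compose stack up, `chat("Summarize ORBIT in one sentence.")` returns the same JSON response as the curl example.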


What can you build with ORBIT?

  • Ask your database questions in any language — Connect Postgres, MySQL, MongoDB, DuckDB, or Elasticsearch and query them with natural language. Built-in language detection responds in the user's language automatically.
  • Switch LLM providers without changing code — Swap between OpenAI, Anthropic, Gemini, Groq, Ollama, vLLM, and more with a config change.
  • Build voice agents — Full-duplex speech-to-speech with interruption handling via PersonaPlex.
  • Power agentic workflows — MCP-compatible, so AI agents can use ORBIT as a tool.
  • Upload files and get answers — RAG over PDFs, images, and documents out of the box.
  • Add guardrails and content moderation — Built-in safety layer with OpenAI, Anthropic, or local (Llama Guard) moderators to filter harmful content before it reaches users.
  • Go from text to speech and back — Plug in STT (Whisper, Google, Gemini) and TTS (OpenAI, ElevenLabs, Coqui) providers for voice-enabled applications.
  • Keep everything private — Self-host on your own infrastructure with RBAC, rate limiting, and audit logging.
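
Asking a database a question works through the same chat endpoint: the API key selects which adapter answers, the way the Docker guide maps simple-chat to default-key via ORBIT_ADAPTER_KEYS. In this sketch the adapter name intent-sql-sqlite-hr comes from the sandbox demos below, but the hr-key value and the question are illustrative placeholders.

```python
import json
import urllib.request

def payload(question: str) -> dict:
    """Wrap a natural-language question in the /v1/chat message format."""
    return {"messages": [{"role": "user", "content": question}],
            "stream": False}

def ask(question: str, api_key: str,
        url: str = "http://localhost:3000/v1/chat") -> str:
    """Send a question to whichever adapter the API key is mapped to
    (e.g. a hypothetical hr-key -> intent-sql-sqlite-hr mapping)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload(question)).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "X-API-Key": api_key,
                 "X-Session-ID": "hr-demo"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# ask("How many employees joined in 2023?", api_key="hr-key")
```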

Supported integrations

LLM Providers: OpenAI, Anthropic, Google Gemini, Cohere, Groq, DeepSeek, Mistral, AWS Bedrock, Azure, Together, Ollama, vLLM, llama.cpp

Data Sources: PostgreSQL, MySQL, MongoDB, Elasticsearch, DuckDB, SQLite, HTTP/REST APIs, GraphQL

Vector Stores: Chroma, Qdrant, Pinecone, Milvus, Weaviate


Why ORBIT?

| Without ORBIT | With ORBIT |
| --- | --- |
| One SDK per provider, rewrites when you switch | One OpenAI-compatible API across all providers |
| Separate pipelines for retrieval and inference | Unified model + retrieval + tooling gateway |
| Fragile glue scripts between data sources and LLMs | Production-ready connectors with policy controls |
| No visibility into what models are doing | Built-in RBAC, rate limiting, and audit logging |

Try it live

The public sandbox hosts one chat workspace per adapter. Pick a demo to see ORBIT in action:

| Demo | Data Source | Try it |
| --- | --- | --- |
| HR Database | SQLite | intent-sql-sqlite-hr |
| DuckDB Analytics | DuckDB | intent-duckdb-analytics |
| EV Population Stats | DuckDB | intent-duckdb-ev-population |
| JSONPlaceholder REST | HTTP (JSON) | intent-http-jsonplaceholder |
| Paris Open Data | HTTP (JSON) | intent-http-paris-opendata |
| MFlix Movies | MongoDB | intent-mongodb-mflix |
| SpaceX GraphQL | GraphQL | intent-graphql-spacex |
| Simple Chat | LLM | simple-chat |
| File Upload Chat | Files | chat-with-files |

Built with ORBIT

  • PoliceStats.ca — Public chat over Canadian municipal police open data. Users ask about auto theft, break-ins, crime by neighbourhood, and cross-city comparisons.

Using ORBIT in production? Let us know and we'll add your project here.


Clients

| Client | Description |
| --- | --- |
| Web Chat | React UI |
| CLI | pip install schmitech-orbit-client |
| Mobile | iOS & Android (Expo) |
| Node SDK | Or use any OpenAI-compatible SDK |
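
Because ORBIT presents an OpenAI-compatible API, existing OpenAI SDKs can be repointed at it by overriding the base URL. A minimal sketch, with two assumptions flagged: the /v1 base path is taken from the quick-start endpoint, and whether ORBIT mirrors the full chat.completions route (and which model names it accepts) is not stated here.

```python
# Base path assumed from the quick-start endpoint (http://localhost:3000/v1/chat).
ORBIT_BASE_URL = "http://localhost:3000/v1"

def make_client(api_key: str = "default-key"):
    """Build an OpenAI SDK client pointed at ORBIT instead of api.openai.com.

    Requires `pip install openai`; any OpenAI-compatible SDK can be
    redirected the same way. Imported lazily so this sketch stands alone.
    """
    from openai import OpenAI
    return OpenAI(base_url=ORBIT_BASE_URL, api_key=api_key)

# client = make_client()
# client.chat.completions.create(model="...", messages=[...])  # model name depends on your config
```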

Deployment options

Docker Compose (fastest path)
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d

Starts ORBIT + Ollama with SmolLM2, auto-pulls models, and exposes the API on port 3000. The web admin UI is at /admin on the same host. Connect orbitchat from your host:

ORBIT_ADAPTER_KEYS='{"simple-chat":"default-key"}' npx orbitchat

See the full Docker Guide for GPU mode, volumes, and configuration.

Pre-built image (server only)
docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 3000:3000 schmitech/orbit:basic

If Ollama runs on your host, add -e OLLAMA_HOST=host.docker.internal:11434 so the container can reach it. This image includes only the simple-chat adapter.

From release tarball (production)
curl -L https://github.com/schmitech/orbit/releases/download/v2.6.4/orbit-2.6.4.tar.gz -o orbit-2.6.4.tar.gz
tar -xzf orbit-2.6.4.tar.gz && cd orbit-2.6.4

cp env.example .env && ./install/setup.sh
source venv/bin/activate
./bin/orbit.sh start && cat ./logs/orbit.log


Contributing

Contributions are welcome! Check the issues for good first tasks, or open a new one to discuss your idea.

If you find ORBIT useful, a star helps others discover the project.


License

Apache 2.0 — see LICENSE.