Dig through any documentation with AI
Docmole is an MCP server that lets you query any documentation site from AI assistants like Claude, Cursor, or any MCP-compatible client. The mole digs through docs so you don't have to.
Features
- Universal docs support – works with any documentation site
- Self-hosted RAG – LanceDB vectors + OpenAI embeddings, no Python needed
- Zero-setup mode – instant access to Mintlify-powered sites
- Multi-turn conversations – remembers context across questions
- WebFetch compatible – links converted to absolute URLs
- MCP native – works with Claude, Cursor, and any MCP client
Coming soon
- Ollama support – fully local mode, no API keys needed
- Generic HTML extraction – support for non-Mintlify documentation sites
- Incremental updates – only re-index changed pages
Installation
To use Docmole, run it directly with bunx (no install needed):
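A minimal sketch, assuming the package is published as `docmole`:

```shell
bunx docmole --help
```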
Or install globally:
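For example, using Bun's global install (package name assumed to be `docmole`):

```shell
bun add -g docmole
```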
Works on macOS, Linux, and Windows. Requires the Bun runtime.
Getting started
Local RAG Mode (any docs site)
Index and query any documentation site. Requires `OPENAI_API_KEY` to be set.
```shell
# One-time setup: discovers pages and builds the vector index
bunx docmole setup --url https://docs.example.com --id my-docs

# Start the MCP server
bunx docmole serve --project my-docs
```
Add to your MCP client:
```json
{
  "mcpServers": {
    "my-docs": {
      "command": "bunx",
      "args": ["docmole", "serve", "--project", "my-docs"]
    }
  }
}
```

Mintlify Mode (zero setup)
For sites with the Mintlify AI Assistant – no API key needed:
```json
{
  "mcpServers": {
    "agno-docs": {
      "command": "bunx",
      "args": ["docmole", "-p", "agno-v2"]
    }
  }
}
```

CLI
Docmole has a built-in CLI for all operations:
```shell
# Mintlify mode (proxy to the Mintlify API)
docmole -p <project-id>

# Local RAG mode
docmole setup --url <docs-url> --id <project-id>
docmole serve --project <project-id>
docmole list
docmole stop --project <project-id>
```
Run `docmole --help` for all options.
How it works
```
┌──────────────┐       ┌──────────────┐       ┌───────────────────────┐
│  MCP Client  │──────▶│   Docmole    │──────▶│  Embedded: LanceDB    │
│  (Claude,    │◀──────│  MCP Server  │◀──────│  Mintlify: API proxy  │
│  Cursor...)  │       └──────────────┘       └───────────────────────┘
└──────────────┘
```
Local RAG Mode: Crawls documentation, generates embeddings with OpenAI, stores in LanceDB. Hybrid search combines semantic and keyword matching.
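The hybrid-search step can be sketched as rank fusion. The snippet below is an illustrative sketch, not Docmole's actual implementation: it merges a semantic ranking and a keyword ranking with reciprocal rank fusion (RRF). All document IDs and scores are made up for the example.

```typescript
type Ranked = { id: string; score: number };

// RRF: each input list contributes 1 / (k + rank) per document; k dampens
// the influence of top ranks (60 is a commonly used default).
function reciprocalRankFusion(lists: Ranked[][], k = 60): Ranked[] {
  const fused = new Map<string, number>();
  for (const list of lists) {
    list
      .slice()
      .sort((a, b) => b.score - a.score) // rank each list by its own score
      .forEach((doc, rank) => {
        fused.set(doc.id, (fused.get(doc.id) ?? 0) + 1 / (k + rank + 1));
      });
  }
  return [...fused.entries()]
    .map(([id, score]) => ({ id, score }))
    .sort((a, b) => b.score - a.score);
}

// Example: "setup.md" ranks well in both lists, so it wins overall.
const semantic: Ranked[] = [
  { id: "setup.md", score: 0.92 },
  { id: "faq.md", score: 0.85 },
];
const keyword: Ranked[] = [
  { id: "cli.md", score: 3.1 },
  { id: "setup.md", score: 2.4 },
];
console.log(reciprocalRankFusion([semantic, keyword])[0].id); // "setup.md"
```

The design point is that neither signal alone decides the ranking: a page that is merely decent in both lists beats a page that tops only one.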
Mintlify Mode: Proxies requests to Mintlify's AI Assistant API. Zero setup, instant results.
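The proxy step can be sketched as follows. The endpoint path is the one noted under "Known Mintlify Project IDs" below; the POST body shape (`{ message: ... }`) is an assumption for illustration, not a documented API.

```typescript
// Sketch of Mintlify-mode proxying. Only the endpoint path is taken from
// this README; the request body shape is a guess for illustration.
const MINTLIFY_BASE = "https://leaves.mintlify.com/api/assistant";

// Build the per-project assistant endpoint.
function assistantUrl(projectId: string): string {
  return `${MINTLIFY_BASE}/${projectId}/message`;
}

// Forward a question to the hosted assistant and return the raw response text.
async function ask(projectId: string, question: string): Promise<string> {
  const res = await fetch(assistantUrl(projectId), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: question }), // assumed body shape
  });
  if (!res.ok) throw new Error(`assistant request failed: ${res.status}`);
  return res.text();
}
```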
Known Mintlify Project IDs
| Documentation | Project ID |
|---|---|
| Agno | agno-v2 |
| Resend | resend |
| Mintlify | mintlify |
| Vercel | vercel |
| Upstash | upstash |
| Plain | plain |
Find more: open DevTools → Network tab → use the site's AI assistant → look for requests to
`leaves.mintlify.com/api/assistant/{project-id}/message`
Configuration
| Environment Variable | Default | Description |
|---|---|---|
| `OPENAI_API_KEY` | (none) | Required for local RAG mode |
| `DOCMOLE_DATA_DIR` | `~/.docmole` | Data directory for projects |
Project structure
```
~/.docmole/
├── projects/
│   └── <project-id>/
│       ├── config.yaml   # Project configuration
│       └── lancedb/      # Vector database
└── global.yaml           # Global settings
```
Documentation
See AGENT.md for detailed documentation including:
- Architecture details
- Backend implementations
- Enterprise deployment guides
Contributing
PRs welcome! See the contributing guide for details.
Acknowledgments
- Mintlify for amazing documentation tooling
- Anthropic for Claude and the MCP protocol
- LanceDB for the vector database
License
The Docmole codebase is licensed under the MIT License.