🔒 Proactive Dependency Security
As part of our commitment to supply chain integrity, we continually monitor our dependency tree against known vulnerabilities and industry advisories. In response to a recently disclosed supply chain incident affecting litellm versions 1.82.7–1.82.8, we audited our packages and removed the litellm dependency from all runtime code. It is now used only in the test directory for skills evaluation and optimization, where it is pinned to a safe version.
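For illustration only (this helper is not part of the kit), a pin like the one above can be sanity-checked programmatically; the affected versions come from the advisory mentioned above, and the version parsing here is a naive sketch:

```python
# Illustrative only: flag the litellm versions named in the advisory
# (1.82.7 and 1.82.8). Naive parsing; not part of ai-dev-kit.
AFFECTED_VERSIONS = {(1, 82, 7), (1, 82, 8)}

def is_affected(version: str) -> bool:
    """Return True if `version` falls in the advisory's affected range."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts in AFFECTED_VERSIONS
```

For example, `is_affected("1.82.7")` returns `True`, while the pinned-down `1.82.6` passes.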
For full third-party attribution, see NOTICE.txt.
Overview
AI-Driven Development (vibe coding) on Databricks just got a whole lot better. The AI Dev Kit gives your AI coding assistant (Claude Code, Cursor, Antigravity, Windsurf, etc.) the trusted sources it needs to build faster and smarter on Databricks.
What Can I Build?
- Spark Declarative Pipelines (streaming tables, CDC, SCD Type 2, Auto Loader)
- Databricks Jobs (scheduled workflows, multi-task DAGs)
- AI/BI Dashboards (visualizations, KPIs, analytics)
- Unity Catalog (tables, volumes, governance)
- Genie Spaces (natural language data exploration)
- Knowledge Assistants (RAG-based document Q&A)
- MLflow Experiments (evaluation, scoring, traces)
- Model Serving (deploy ML models and AI agents to endpoints)
- Databricks Apps (full-stack web applications with foundation model integration)
- ...and more
Choose Your Own Adventure
| Adventure | Best For | Start Here |
|---|---|---|
| ⭐ Install AI Dev Kit | Start here! Follow quick install instructions to add to your existing project folder | Quick Start (install) |
| Visual Builder App | Web-based UI for Databricks development | databricks-builder-app/ |
| Core Library | Building custom integrations (LangChain, OpenAI, etc.) | pip install |
| Skills Only | Provide Databricks patterns and best practices (without MCP functions) | Install skills |
| Genie Code Skills | Install skills into your workspace for Genie Code (--install-to-genie) | Genie Code skills (install) |
| MCP Tools Only | Just executable actions (no guidance) | Register MCP server |
Quick Start
Prerequisites
- uv - Python package manager
- Databricks CLI - Command line interface for Databricks
- AI coding environment (one or more):
Install in existing project
By default this installs at the project level rather than the user level. This is often a good fit, but it requires running your client from the exact directory used for the install. Note: project configuration files can be reused in other projects; you can find these configs under .claude, .cursor, .gemini, .codex, .github, or .agents.
Mac / Linux
Basic installation (uses DEFAULT profile, project scope):

```bash
bash <(curl -sL https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/install.sh)
```

Advanced Options (click to expand)

Global installation with force reinstall:

```bash
bash <(curl -sL https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/install.sh) --global --force
```

Specify profile and force reinstall:

```bash
bash <(curl -sL https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/install.sh) --profile DEFAULT --force
```

Install for specific tools only:

```bash
bash <(curl -sL https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/install.sh) --tools cursor,gemini,antigravity
```

Next steps: Respond to interactive prompts and follow the on-screen instructions.
- Note: Cursor and Copilot require updating settings manually after install.
Windows (PowerShell)
Basic installation (uses DEFAULT profile, project scope):

```powershell
irm https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/install.ps1 | iex
```

Advanced Options (click to expand)

Download the script first:

```powershell
irm https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/install.ps1 -OutFile install.ps1
```

Global installation with force reinstall:

```powershell
.\install.ps1 -Global -Force
```

Specify profile and force reinstall:

```powershell
.\install.ps1 -Profile DEFAULT -Force
```

Install for specific tools only:

```powershell
.\install.ps1 -Tools cursor,gemini,antigravity
```
Next steps: Respond to interactive prompts and follow the on-screen instructions.
- Note: Cursor and Copilot require updating settings manually after install.
Visual Builder App
Full-stack web application with chat UI for Databricks development. Deploys a Lakebase database and Databricks App with a single command:
```bash
cd ai-dev-kit/databricks-builder-app

# Deploy everything (Lakebase + app + permissions)
./scripts/deploy.sh my-builder-app --profile <your-profile>
```
For local development:
```bash
./scripts/setup.sh        # Install dependencies
# Edit .env.local with your credentials
./scripts/start_dev.sh    # Start locally at http://localhost:3000
```
See databricks-builder-app/ for full documentation.
Core Library
Use databricks-tools-core directly in your Python projects:
```python
from databricks_tools_core.sql import execute_sql

results = execute_sql("SELECT * FROM my_catalog.schema.table LIMIT 10")
```
Works with LangChain, OpenAI Agents SDK, or any Python framework. See databricks-tools-core/ for details.
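As a hedged sketch of how a function like execute_sql might be wired into a function-calling agent (the tool schema and dispatch helper below are assumptions for illustration, not part of databricks-tools-core):

```python
# Sketch: describe a SQL helper as an OpenAI-style function tool and
# route the agent's tool calls to it. Schema fields are assumptions.
EXECUTE_SQL_TOOL = {
    "type": "function",
    "function": {
        "name": "execute_sql",
        "description": "Run a SQL statement against Databricks and return rows.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "SQL to execute"},
            },
            "required": ["query"],
        },
    },
}

def dispatch_tool_call(name: str, arguments: dict, registry: dict):
    """Route an agent's tool call to the matching Python callable."""
    if name not in registry:
        raise KeyError(f"Unknown tool: {name}")
    return registry[name](**arguments)

# In real use the registry would map "execute_sql" to
# databricks_tools_core.sql.execute_sql; a stub stands in here.
result = dispatch_tool_call(
    "execute_sql",
    {"query": "SELECT 1"},
    {"execute_sql": lambda query: [{"1": 1}]},
)
```

The same registry pattern works for any framework that emits (tool name, arguments) pairs.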
Genie Code Skills
Install skills into ./.claude/skills (relative to the directory where you run the script), then upload them to your workspace at /Workspace/Users/<you>/.assistant/skills so Genie Code can use them in the UI. Requires the Databricks CLI authenticated for your workspace.
Always run from the project directory where you want .claude/skills created (for example your repo root or ai-dev-kit).
From this repo (recommended if you have a clone):
```bash
# Databricks skills from this checkout + upload (DEFAULT CLI profile)
./databricks-skills/install_skills.sh --local --install-to-genie

# Download all skills from GitHub, then upload
./databricks-skills/install_skills.sh --install-to-genie

# Explicit Databricks CLI profile
./databricks-skills/install_skills.sh --install-to-genie --profile YOUR_PROFILE
```
Without cloning (run from the directory that should contain .claude/skills):
```bash
curl -sSL https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/databricks-skills/install_skills.sh | bash -s -- --install-to-genie
```

Combine --profile, --local, specific skill names, --mlflow-version, etc., as needed; see `./databricks-skills/install_skills.sh --help` or databricks-skills/README.md.
Skill modification or Custom Skill
After the script installs the skills to your workspace, you will find them under /Workspace/Users/<your_user_name>/.assistant/skills.
This directory is fully customizable: keep only the skills you need, modify or remove existing ones, or create new skill folders tailored to your organization to make Genie Code even better. Genie Code automatically picks up any skills in this directory in every session.
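To sketch what scaffolding a custom skill could look like (the SKILL.md layout and frontmatter fields below are assumptions; mirror an existing installed skill for the exact format):

```python
# Sketch: scaffold a custom skill folder. The SKILL.md frontmatter
# fields (name, description) are assumptions -- copy the structure of
# an installed skill from .assistant/skills for the real format.
from pathlib import Path

def scaffold_skill(root: Path, name: str, description: str) -> Path:
    """Create <root>/<name>/SKILL.md with minimal, assumed frontmatter."""
    skill_dir = root / name
    skill_dir.mkdir(parents=True, exist_ok=True)
    skill_md = skill_dir / "SKILL.md"
    skill_md.write_text(
        f"---\nname: {name}\ndescription: {description}\n---\n\n"
        f"# {name}\n\nDocument your organization's patterns here.\n"
    )
    return skill_md

# Example (writes under ./.claude/skills; upload via the install script):
# scaffold_skill(Path(".claude/skills"), "my-org-sql-style",
#                "House SQL conventions for our lakehouse")
```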
What's Included
| Component | Description |
|---|---|
| databricks-tools-core/ | Python library with high-level Databricks functions |
| databricks-mcp-server/ | MCP server exposing 50+ tools for AI assistants |
| databricks-skills/ | 20 markdown skills teaching Databricks patterns |
| databricks-builder-app/ | Full-stack web app with Claude Code integration |
Star History
License
(c) 2026 Databricks, Inc. All rights reserved.
The source in this project is provided subject to the Databricks License. See LICENSE.md for details.
Third-Party Licenses
| Package | Version | License | Project URL |
|---|---|---|---|
| fastmcp | ≥0.1.0 | MIT | https://github.com/jlowin/fastmcp |
| mcp | ≥1.0.0 | MIT | https://github.com/modelcontextprotocol/python-sdk |
| sqlglot | ≥20.0.0 | MIT | https://github.com/tobymao/sqlglot |
| sqlfluff | ≥3.0.0 | MIT | https://github.com/sqlfluff/sqlfluff |
| plutoprint | ==0.19.0 | MIT | https://github.com/plutoprint/plutoprint |
| claude-agent-sdk | ≥0.1.19 | MIT | https://github.com/anthropics/claude-code |
| fastapi | ≥0.115.8 | MIT | https://github.com/fastapi/fastapi |
| uvicorn | ≥0.34.0 | BSD-3-Clause | https://github.com/encode/uvicorn |
| httpx | ≥0.28.0 | BSD-3-Clause | https://github.com/encode/httpx |
| sqlalchemy | ≥2.0.41 | MIT | https://github.com/sqlalchemy/sqlalchemy |
| alembic | ≥1.16.1 | MIT | https://github.com/sqlalchemy/alembic |
| asyncpg | ≥0.30.0 | Apache-2.0 | https://github.com/MagicStack/asyncpg |
| greenlet | ≥3.0.0 | MIT | https://github.com/python-greenlet/greenlet |
| psycopg2-binary | ≥2.9.11 | LGPL-3.0 | https://github.com/psycopg/psycopg2 |
Acknowledgments
MCP Databricks Command Execution API from databricks-exec-code by Natyra Bajraktari and Henryk Borzymowski.