Databricks Semantic Layer with Bonnard
Connect Bonnard to Databricks and ship governed metrics to AI agents, dashboards, and your product. YAML semantic layer with MCP and React SDK.
Bonnard gives you a Databricks semantic layer defined in YAML, version-controlled, and queryable from AI agents, React components, and REST APIs. If your team runs Unity Catalog, Delta Lake tables, or Databricks SQL warehouses, Bonnard connects directly and exposes governed metrics through MCP, React SDK, REST API, and markdown dashboards.
How does Bonnard connect to Databricks?
Add Databricks as a datasource in your Bonnard project. Define the connection in your `datasources.yml`:
```yaml
# datasources.yml
datasources:
  - name: databricks_warehouse
    type: databricks
    workspace_url: https://your-workspace.cloud.databricks.com
    token: ${DATABRICKS_TOKEN}
    catalog: main
    schema: analytics
    http_path: /sql/1.0/warehouses/abc123
```
Then run:
```shell
bon datasource add databricks_warehouse
bon deploy
```
Bonnard authenticates with your Databricks workspace, introspects your catalog and schema, and deploys your semantic layer. It works with both Databricks SQL warehouses and all-purpose clusters, and your lakehouse data becomes queryable through every Bonnard surface within minutes.
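With the datasource deployed, you define metrics in YAML on top of your Unity Catalog tables. A minimal sketch of what a cube file might look like; the file path, table, and field names here are illustrative assumptions, not a prescribed schema:

```yaml
# cubes/orders.yml -- illustrative example; names and structure are assumptions
cubes:
  - name: orders
    datasource: databricks_warehouse
    sql_table: main.analytics.orders   # Unity Catalog table: catalog.schema.table
    measures:
      - name: total_revenue            # summed across the filtered rows
        sql: amount
        type: sum
    dimensions:
      - name: order_date               # time dimension for date grouping
        sql: created_at
        type: time
```

Because the definition lives in YAML, it can be reviewed and version-controlled like any other code before `bon deploy` ships it.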
What do you get?
Once connected, your Databricks data is available through four surfaces:
MCP server. Run `bon mcp` and your AI agents (Claude, ChatGPT, Cursor) query governed Databricks metrics with row-level security. Generate publishable keys per tenant for customer-facing agentic analytics.
React SDK. Drop BarChart, LineChart, and BigValue components into your product. Every chart queries your Databricks lakehouse through the semantic layer with multi-tenant access controls applied automatically.
REST API. Query metrics programmatically from any language or platform. Type-safe queries with the TypeScript SDK or raw HTTP from anywhere.
Markdown dashboards. Author dashboards in markdown, deploy with `bon deploy`, and share governed views with your team or customers.
How does Bonnard compare to native Databricks analytics?
| Capability | Databricks native | Bonnard + Databricks |
|---|---|---|
| Metric definitions | Unity Catalog metrics (preview) | YAML semantic layer (version-controlled) |
| AI agent access | Genie / AI functions | MCP server with publishable keys |
| Embedded analytics | Databricks SQL dashboards | React SDK with multi-tenant auth |
| Dashboards | Databricks SQL dashboards | Markdown dashboards, deployed via CLI |
| Multi-tenancy | Unity Catalog + workspace isolation | Publishable keys + row-level security |
| Pre-aggregation | Materialized views (Delta) | Automatic pre-aggregation cache |
| dbt integration | dbt-databricks adapter | `bon datasource add --from-dbt` imports models |
| Access control | Unity Catalog RBAC | YAML-defined RBAC + audit logging |
| Deployment | Workspace notebooks | `bon deploy` (no restart, no SSH) |
FAQ
Does Bonnard support Databricks?
Yes. Databricks is a first-class Bonnard datasource and works with Databricks on AWS, Azure, and GCP. Configure your workspace URL, token, and SQL warehouse HTTP path, then deploy.
Does Bonnard work with Unity Catalog?
Yes. Specify your catalog and schema in the datasource config. Bonnard introspects Unity Catalog tables and views. You define cubes and measures in YAML on top of your existing catalog structure.
Can I use pre-aggregations with Databricks?
Yes. The pre-aggregation cache handles this automatically: define rollups in your cube YAML files and Bonnard builds and refreshes them on a schedule. This reduces Databricks SQL warehouse compute and speeds up repeated queries.
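As a sketch of what a rollup definition could look like inside a cube file; the `pre_aggregations` key and its fields are assumptions for illustration, since the exact schema isn't documented here:

```yaml
# hypothetical rollup inside a cube file; key names are assumptions
pre_aggregations:
  - name: revenue_by_day
    measures: [total_revenue]     # pre-computed measure
    time_dimension: order_date    # rolled up by this time dimension
    granularity: day
    refresh: every 1 hour         # refresh cadence for the cached rollup
```

Queries that match the rollup's measures and granularity would then hit the cache instead of the Databricks SQL warehouse.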
Can I import dbt models from Databricks?
Yes. Run `bon datasource add --from-dbt` pointed at your dbt project that uses the dbt-databricks adapter. Bonnard imports your models as cubes and your metrics as measures, layering the semantic layer on top without rewriting transformations.
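As a rough illustration of that mapping, a dbt model plus a metric might land in the semantic layer as something like the following; the generated structure, model name, and metric are assumptions, not guaranteed output:

```yaml
# illustrative result of a dbt import; actual generated YAML may differ
cubes:
  - name: fct_orders              # from a dbt model named fct_orders
    datasource: databricks_warehouse
    sql_table: main.analytics.fct_orders
    measures:
      - name: revenue             # from a dbt metric named revenue
        sql: amount
        type: sum
```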
Connect Databricks. Ship governed analytics.
Define your metrics in YAML, connect to Databricks, and expose governed analytics through MCP, React SDK, and REST API.