
dbt + BI-as-Code Integration Guide for Scalable BI
Learn how to integrate dbt with BI-as-Code for a scalable, maintainable analytics stack that grows with your organization.

The combination of dbt™ for data transformation and BI-as-Code for visualization creates a powerful, scalable analytics stack. According to the dbt Labs State of Analytics Engineering 2023, 67% of organizations now use dbt™ for data transformation. When these tools work together seamlessly, organizations achieve true analytics engineering—where data pipelines and dashboards evolve together as one cohesive system.
How dbt and BI-as-Code Complement Each Other
dbt™ handles data transformation within your warehouse, creating clean, tested, documented data models. Learn more at docs.getdbt.com. BI-as-Code takes these models and creates visualizations through code-based configurations. Together, they form a complete analytics pipeline from raw data to polished dashboards.
The synergy is powerful:
- dbt™ ensures data quality and consistency
- BI-as-Code ensures visualization consistency and version control
- Both use Git for collaboration (see git-scm.com)
- Both support CI/CD workflows (see our CI/CD analytics implementation guide)
- Both enable testing and documentation
A McKinsey study found that data-driven organizations are 23 times more likely to acquire customers, 6 times as likely to retain them, and 19 times as likely to be profitable. The integrated approach described here is what makes that kind of comprehensive, data-driven operation achievable.
Integration Strategies
Shared Repository Structure:
analytics-platform/
├── dbt/
│   ├── models/
│   │   ├── staging/
│   │   └── marts/
│   ├── tests/
│   └── macros/
├── dashboards/
│   ├── executive/
│   ├── operational/
│   └── components/
├── .github/
│   └── workflows/
│       └── integrated-pipeline.yml
└── README.md
Synchronized Deployment:
# integrated-pipeline.yml
name: Analytics Platform Deploy

on:
  push:
    branches: [main]

jobs:
  transform:
    name: Run dbt transformations
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: dbt
    steps:
      - uses: actions/checkout@v4
      - name: dbt run
        run: |
          dbt deps
          dbt run
          dbt test
      - name: Generate manifest
        run: |
          dbt docs generate
          cp target/manifest.json ../dashboards/
      - name: Share manifest with the dashboard job
        uses: actions/upload-artifact@v4
        with:
          name: dbt-manifest
          path: dashboards/manifest.json

  visualize:
    name: Update dashboards
    needs: transform
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: dashboards
    steps:
      - uses: actions/checkout@v4
      - name: Fetch manifest from the transform job
        uses: actions/download-artifact@v4
        with:
          name: dbt-manifest
          path: dashboards
      - name: Sync dbt metadata
        run: visivo sync --dbt-manifest manifest.json
      - name: Deploy dashboards
        run: visivo deploy --production
      - name: Validate integration
        run: visivo test --check-dbt-refs
Automated Dependency Management:
# sync-dependencies.py
import glob
import json
import yaml

def sync_dbt_to_bi():
    """Ensure BI dashboards reference valid dbt models."""
    # Load the dbt manifest and index models by name
    with open('dbt/target/manifest.json') as f:
        nodes = json.load(f)['nodes']
    dbt_models = {
        node['name']: node
        for node in nodes.values()
        if node['resource_type'] == 'model'
    }
    # Load and validate each dashboard config
    for path in glob.glob('dashboards/**/*.yml', recursive=True):
        with open(path) as f:
            dashboard = yaml.safe_load(f)
        for chart in dashboard['charts']:
            source = chart['model']
            # Validate that the referenced model exists in dbt
            if source not in dbt_models:
                raise ValueError(f"Dashboard references non-existent model: {source}")
            # Update the chart's column list from the dbt schema
            chart['columns'] = list(dbt_models[source]['columns'])
        # Write the updated config back to disk
        with open(path, 'w') as f:
            yaml.safe_dump(dashboard, f, sort_keys=False)
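In practice this check works best as a gate: run it in CI before the dashboard deploy step so a renamed or deleted dbt model fails the build instead of shipping a broken chart. Here is a minimal sketch of an entry point for that, appended to the script above; the exit-code convention is our assumption, not anything dbt or a BI tool requires:
import sys

if __name__ == "__main__":
    try:
        sync_dbt_to_bi()
    except (FileNotFoundError, ValueError) as exc:
        # A missing manifest or an invalid model reference should fail the pipeline
        print(f"dbt/BI sync failed: {exc}", file=sys.stderr)
        sys.exit(1)
    print("All dashboard model references are valid.")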
Example Workflow
Complete Integration Example:
# Project-level config
name: example_project

# BI dashboard referencing a dbt model
# dashboards/executive.yml
name: Executive Dashboard
sources:
  - name: warehouse
    type: snowflake
    account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
    database: ANALYTICS
    warehouse: COMPUTE_WH
    db_schema: PUBLIC
charts:
  - name: revenue_trend
    source: revenue_data
    type: line
    x: month
    y: total_revenue
    # Metadata from dbt
    description: "{{ dbt.get_model_description('revenue_summary') }}"
    freshness: "{{ dbt.get_model_freshness('revenue_summary') }}"
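The dbt.get_model_description and dbt.get_model_freshness calls above are template helpers. If your BI tool doesn't provide them out of the box, the same metadata can be read directly from dbt's artifacts once dbt run and dbt docs generate have completed. A rough sketch of how such helpers could be implemented follows; the function names and file paths are illustrative assumptions, not part of dbt's or Visivo's API:
# dbt_metadata.py - illustrative helpers built on dbt's artifact files
import json

def get_model_description(model_name, manifest_path="dbt/target/manifest.json"):
    """Look up a model's description from the dbt manifest."""
    with open(manifest_path) as f:
        nodes = json.load(f)["nodes"]
    for node in nodes.values():
        if node["resource_type"] == "model" and node["name"] == model_name:
            return node.get("description", "")
    raise KeyError(f"Model not found in manifest: {model_name}")

def get_last_run_timestamp(run_results_path="dbt/target/run_results.json"):
    """Use the run_results timestamp as a rough freshness signal for the whole run."""
    with open(run_results_path) as f:
        return json.load(f)["metadata"]["generated_at"]
Whatever templating step your BI-as-Code tool provides can then inject these values into the dashboard YAML at build time.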
Testing Integration:
# Integrated tests
tests:
  data_quality:
    - dbt_test: unique_customer_id
    - dbt_test: revenue_positive
  dashboard_accuracy:
    - name: totals_match
      assertion: |
        dashboard.total_revenue ==
        dbt.run_query("SELECT SUM(total_revenue) FROM revenue_summary")
  integration:
    - name: all_dashboard_sources_exist
      check: every_dashboard_source_in_dbt_catalog
    - name: column_references_valid
      check: all_column_refs_exist_in_dbt_schema
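The totals_match assertion is the one that pays for itself fastest: it catches the classic failure where a dashboard aggregate silently drifts from the warehouse. A minimal sketch of how that comparison could be implemented, assuming a standard DB-API cursor pointed at the same schema dbt just built (the function name and tolerance are assumptions, not part of dbt or any BI tool):
# Compare a dashboard's reported total against the dbt model it claims to show
def totals_match(cursor, dashboard_total, tolerance=0.01):
    """Return True if the dashboard total matches the warehouse within tolerance."""
    cursor.execute("SELECT SUM(total_revenue) FROM revenue_summary")
    (warehouse_total,) = cursor.fetchone()
    # Allow a small relative tolerance for rounding differences across layers
    return abs(float(warehouse_total) - float(dashboard_total)) <= tolerance * abs(float(warehouse_total))
When this runs in CI against the freshly built models, a failure blocks the deploy rather than surfacing in a stakeholder meeting.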
Achieving Scalability
The integrated approach scales elegantly:
- Modular Growth: Add new sources and models to dbt and they become available to BI dashboards with no extra plumbing (see the sketch after this list).
- Team Scalability: Data engineers work on dbt models, analysts work on dashboards, and the dbt schema is the clear interface between them.
- Performance Scalability: dbt handles the heavy transformations inside the warehouse; the BI layer only has to visualize the results.
- Maintenance Scalability: A change made in one place propagates automatically to everything downstream.
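To make "automatically available" concrete, the sketch below scans the dbt manifest for mart models that no dashboard references yet and writes a starter chart config for each one. The file paths and config shape mirror the examples earlier in this guide, but the scaffolding workflow itself is an assumption, not a built-in Visivo feature:
# scaffold_new_marts.py - draft a chart stub for any mart model no dashboard uses yet
import glob
import json
import yaml

def scaffold_new_marts():
    with open("dbt/target/manifest.json") as f:
        nodes = json.load(f)["nodes"]
    # Mart models are the ones dashboards are expected to read from
    marts = {
        node["name"] for node in nodes.values()
        if node["resource_type"] == "model" and "marts" in node["path"]
    }
    # Collect every model already referenced by an existing dashboard
    referenced = set()
    for path in glob.glob("dashboards/**/*.yml", recursive=True):
        with open(path) as f:
            config = yaml.safe_load(f) or {}
        referenced.update(chart.get("source") for chart in config.get("charts", []))
    # Write a starter config for anything new, ready for an analyst to flesh out
    for model in sorted(marts - referenced):
        stub = {"name": f"{model}_overview", "charts": [{"name": model, "source": model}]}
        with open(f"dashboards/components/{model}.yml", "w") as f:
            yaml.safe_dump(stub, f, sort_keys=False)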
Organizations with integrated dbt + BI-as-Code report:
- Eliminated metric mismatches
- Faster end-to-end development
- Consistent definitions
- Improved scalability
The integration of dbt and BI-as-Code isn't just a technical improvement—it's a fundamental shift toward treating analytics as a first-class engineering discipline. This combination provides the foundation for analytics that scales with your organization's ambitions.