dbt Cloud
dbt (data build tool) is a widely used transformation framework that enables data teams to define, execute, and test SQL-based transformations in their data warehouses. dbt Cloud is the managed SaaS offering from dbt Labs, providing a hosted environment for scheduling and monitoring dbt projects.
The dbt Cloud connector in Acceldata Data Observability Cloud (ADOC) enables you to onboard dbt Cloud as a data source and monitor your dbt workloads as pipelines within ADOC. Once configured, ADOC automatically discovers dbt Cloud jobs, tracks executions, and provides visibility into transformation lineage, resource-level execution details, and test results — without requiring manual instrumentation.
Why Monitor dbt Cloud in ADOC
dbt transformations are a critical step in the data lifecycle. Failures, schema mismatches, or test violations in dbt jobs can silently degrade the quality of downstream assets, dashboards, and reports. Without centralized observability, detecting and diagnosing these issues requires manual intervention across multiple tools.
By integrating dbt Cloud with ADOC, you can:
- Monitor all dbt job executions as pipeline runs in a single interface
- View end-to-end lineage across models, snapshots, seeds, sources, and tests
- Inspect the compiled SQL query executed for each resource during a run
- Track test results — including failures — as part of the pipeline execution graph
- Filter and navigate dbt Cloud pipelines by data source alongside other integrated platforms
How dbt Cloud Appears in ADOC
ADOC maps dbt Cloud concepts to its pipeline model as follows:
| dbt Cloud | ADOC |
|---|---|
| Job | Pipeline |
| Job execution (Run) | Pipeline Run |
| Model | Job node → Asset node (view or table) |
| Snapshot | Job node → Asset node (snapshot table) |
| Seed | Job node → Asset node (warehouse table) |
| Test | Job node (evaluation, no asset created) |
| Source | Job node (used for freshness measurement) |
Each job in dbt Cloud is represented as a single pipeline in ADOC. Every execution of that job is tracked as a single pipeline run, regardless of the number of dbt commands involved. This provides a consistent, unified view of each job's execution history and lineage.
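The mapping above can be sketched as a simple lookup table, which is handy when reasoning about what a given dbt resource type becomes in ADOC. The labels below are descriptive, not ADOC's internal identifiers:

```python
# Illustrative mapping of dbt Cloud resource types to ADOC pipeline concepts.
# Labels are descriptive only, not ADOC's internal identifiers.
DBT_TO_ADOC = {
    "job": "Pipeline",
    "run": "Pipeline Run",
    "model": "Job node -> Asset node (view or table)",
    "snapshot": "Job node -> Asset node (snapshot table)",
    "seed": "Job node -> Asset node (warehouse table)",
    "test": "Job node (evaluation, no asset created)",
    "source": "Job node (used for freshness measurement)",
}

def adoc_concept(dbt_resource_type: str) -> str:
    """Return the ADOC concept a dbt Cloud resource type maps to."""
    return DBT_TO_ADOC[dbt_resource_type.lower()]
```

Note that models, snapshots, and seeds produce both a job node and an asset node, while tests and sources produce job nodes only.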
Prerequisites
The dbt Cloud integration is generally available from ADOC Release 26.4.0.
Before you configure the dbt Cloud connector, make sure you have the following:
- A dbt Cloud account with one or more active projects
- A dbt Cloud Service Token with Job Viewer permissions
- The dbt Cloud URL, Account ID, and Auth Token
- An ADOC Data Plane running version 4.5.0 or later
To generate a Service Token:
- In dbt Cloud, navigate to Account Settings.
- Select API Tokens → Service Tokens.
- Create a new token and assign it Job Viewer permissions.
- Copy the token value for use during ADOC configuration.
Configuration Settings
The dbt Cloud connector uses the following environment variables to control how often it polls dbt Cloud and how much run history it retrieves. These settings apply to all tenants on the platform. To change a value, contact your ADOC administrator, because the update requires a deployment change.
| Setting | Default | Description |
|---|---|---|
| PIPELINE_HEARTBEAT_DBT_CLOUD_INTERVAL | 30 minutes | Sets how often ADOC polls dbt Cloud for new job run data. |
| DBT_CLOUD_PIPELINE_RUNS_LOOKBACK_HOURS | 24 hours | Defines the time window used to retrieve job run history on the first pull. Later pulls use the last polled time as the starting point. |
Setting Up a dbt Cloud Project and Job
If you don’t already have a dbt Cloud project and job to monitor, complete these steps in dbt Cloud before you configure the connector in ADOC.
Create a dbt Cloud project
- On the dbt Cloud dashboard, click New Project.
- Enter a project name and configure your development environment.
- Select your data warehouse (for example, Snowflake) and enter the warehouse credentials.
- Connect a Git repository. You can let dbt Cloud create a managed repository for you.
- Set up a deployment environment that uses the same warehouse connection.
Create a dbt Cloud job
- In dbt Cloud, open your project and select Jobs in the left navigation.
- Click Create Job and enter a descriptive job name.
- Select the deployment environment and the dbt version to use.
- Add the commands to run, such as dbt run, dbt test, and dbt docs generate.
- Save the job. To verify that it works, open the job details page and click Run Now.
Setting Up dbt Cloud as a Data Source in ADOC
The dbt Cloud connector uses a pull-based mechanism to regularly retrieve job metadata and execution history from your dbt Cloud account.
To configure the connector:
- In ADOC, in the left navigation menu, click Control Center -> Integrations -> Add Data Source.
- Select dbt Cloud from the list of data sources.
- Enter a name and description for the data source.
- Ensure the Data Reliability toggle is enabled, and select a Data Plane.
- Enter the following dbt Cloud connection details:
  - dbt Cloud URI
  - Account ID
  - Auth token
- Click Test Connection. If the connection succeeds, click Next; otherwise, review the connection details you entered.
- On the Set Up Observability page, select one or more dbt Cloud projects to monitor, and click Submit.
After the data source is created, ADOC automatically begins pulling metadata from the selected dbt Cloud projects.
Viewing dbt Cloud Pipelines in ADOC
To view dbt Cloud pipeline data, navigate to the Pipelines page. You can filter by the dbt Cloud data source to display only dbt-related pipelines.
Each pipeline corresponds to a dbt Cloud job. Selecting a pipeline displays its execution history as a list of pipeline runs. Within a run, you can:
- View the execution timeline across all resource nodes in the job
- Inspect the compiled SQL query for each model, snapshot, seed, or test node
- Review test results, including failure counts and the SQL query that produced the failures
- Navigate lineage to understand upstream inputs and downstream outputs of each resource
Known Limitations
Keep the following limitations in mind when you monitor dbt Cloud in ADOC:
- Pipeline runs are not real time. A dbt Cloud job run appears in ADOC only after the run finishes, because ADOC builds the run from the artifacts that dbt Cloud publishes after each run step.
- Asset correlation is supported only for Snowflake. For dbt resources that read from or write to other warehouses, ADOC still tracks the run, but it doesn’t link the resources to ADOC assets.
- The integration is execution based. Only the models, snapshots, seeds, tests, and sources that actually run as part of a dbt Cloud job appear in the ADOC pipeline run. Resources that are defined in the project but not executed are not shown.
- Only projects explicitly selected during data source setup are monitored. To add or remove projects, edit the data source configuration.
- If a job is deleted in dbt Cloud, its corresponding pipeline and historical run data are retained in ADOC.
- Test nodes in the pipeline graph represent validation checks and do not produce output assets.
- Source nodes represent freshness measurement inputs and reflect tables already present in the warehouse.
For additional help, visit www.acceldata.force.com or call our service desk at +1 844 9433282.
Copyright © 2025