Snowflake Compute

Connecting Snowflake to ADOC helps you monitor and optimize the performance and cost of your Snowflake compute resources. Once connected, ADOC can:

  • Track credit consumption to improve cost efficiency.
  • Enable automated alerts and diagnostics for query performance and compute utilization.

Prerequisites

Before connecting Snowflake to ADOC, you need to set up Snowflake so ADOC can securely monitor your data for quality, performance, and cost. This setup ensures the right users and roles exist, and that they have the permissions needed to access only what’s necessary.

Note All steps in this section must be executed from the Snowflake Web UI or a SQL editor (like SnowSQL or a Snowflake IDE) using an account with ACCOUNTADMIN privileges.

Users and Roles Overview

Before we start, here’s a quick overview of the users and roles that will be created:

Role/User | Purpose
AD_COMPUTE_MONITOR | Role that grants access to monitor Snowflake credit and performance
AD_USER | Authenticates into ADOC for compute observability and cost tracking

To set up your Snowflake data source for compute monitoring, you must do the following:


1. Define Variables

Start by defining variables for reuse across the setup steps. This makes the script portable and easier to manage or rerun later.
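For example, you can capture the names used in the rest of the setup as session variables. The role, user, and database names below come from the overview table; the warehouse name is a suggestion:

```sql
-- Session variables for reuse; reference them later with IDENTIFIER($var)
SET monitor_role      = 'AD_COMPUTE_MONITOR';
SET monitor_user      = 'AD_USER';
SET monitor_warehouse = 'AD_MONITOR_WH';  -- suggested name
SET monitor_db        = 'AD_MONITOR_DB';  -- default monitoring database name
```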

2. Create Role and Warehouse

Ensure you’re using the ACCOUNTADMIN role for full privileges:

  1. Create a dedicated warehouse: This warehouse runs ADOC’s lightweight metadata queries on low-cost, temporary compute. It isolates observability compute from your production workloads, helping you manage cost and reduce risk.
  2. Create a custom monitoring role: This role is scoped specifically to monitor performance and usage. It ensures least-privileged access and easier audit tracking.
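A minimal sketch of these two steps, assuming the suggested warehouse name AD_MONITOR_WH:

```sql
USE ROLE ACCOUNTADMIN;

-- Small, auto-suspending warehouse dedicated to ADOC metadata queries
CREATE WAREHOUSE IF NOT EXISTS AD_MONITOR_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE;

-- Least-privileged role scoped to monitoring
CREATE ROLE IF NOT EXISTS AD_COMPUTE_MONITOR;
```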

3. Grant Required Privileges

  1. Grant access to Snowflake’s Account Usage schema: Required to access views such as QUERY_HISTORY and WAREHOUSE_METERING_HISTORY.
  2. Enable monitoring of Snowflake resources: Allows access to credit usage, performance stats, query behavior, and warehouse efficiency.
  3. Allow role to use the monitoring warehouse
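These three grants might look like the following, assuming the warehouse name AD_MONITOR_WH from the earlier step:

```sql
-- 1. Account Usage views (QUERY_HISTORY, WAREHOUSE_METERING_HISTORY, ...)
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE AD_COMPUTE_MONITOR;

-- 2. Account-level visibility into credit usage and query behavior
GRANT MONITOR USAGE ON ACCOUNT TO ROLE AD_COMPUTE_MONITOR;

-- 3. Allow the role to run queries on the monitoring warehouse
GRANT USAGE ON WAREHOUSE AD_MONITOR_WH TO ROLE AD_COMPUTE_MONITOR;
```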

4. Create User for Monitoring

Create a new user that ADOC will use to query Snowflake metadata, setting its default role and namespace so sessions start with the correct context. If you reuse an existing user, update its defaults instead.

  1. Assign role to user
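A sketch of the user creation and role assignment, with a placeholder password and the default monitoring database as the namespace:

```sql
CREATE USER IF NOT EXISTS AD_USER
  PASSWORD          = '<choose-a-strong-password>'
  DEFAULT_ROLE      = AD_COMPUTE_MONITOR
  DEFAULT_WAREHOUSE = AD_MONITOR_WH
  DEFAULT_NAMESPACE = 'AD_MONITOR_DB.PUBLIC';

-- 1. Assign role to user
GRANT ROLE AD_COMPUTE_MONITOR TO USER AD_USER;
```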

5. Create Monitoring Database and Grant Schema Access

  1. Create the monitoring database
  2. Grant usage on database
  3. Switch to the new database
  4. Grant schema-level privileges: These permissions are needed to manage internal stages, formats, and collect metadata.
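The four sub-steps can be sketched as follows, using the default database name AD_MONITOR_DB and its PUBLIC schema:

```sql
-- 1. Create the monitoring database
CREATE DATABASE IF NOT EXISTS AD_MONITOR_DB;

-- 2. Grant usage on database
GRANT USAGE ON DATABASE AD_MONITOR_DB TO ROLE AD_COMPUTE_MONITOR;

-- 3. Switch to the new database
USE DATABASE AD_MONITOR_DB;

-- 4. Schema-level privileges for internal stages, file formats, and metadata collection
GRANT USAGE, CREATE STAGE, CREATE FILE FORMAT ON SCHEMA PUBLIC TO ROLE AD_COMPUTE_MONITOR;
```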

6. Use a Stored Procedure to Grant Monitor Privileges on All Warehouses

This script ensures the monitoring role has access to all current and future warehouses and resource monitors.

  1. Create the procedure
  2. Grant usage on the procedure
  3. Run the procedure
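One way to sketch such a procedure is a JavaScript stored procedure that iterates over SHOW WAREHOUSES (the procedure name is illustrative):

```sql
-- 1. Create the procedure
CREATE OR REPLACE PROCEDURE AD_MONITOR_DB.PUBLIC.GRANT_MONITOR_ON_ALL_WAREHOUSES()
RETURNS STRING
LANGUAGE JAVASCRIPT
EXECUTE AS CALLER
AS
$$
  var rs = snowflake.execute({ sqlText: "SHOW WAREHOUSES" });
  while (rs.next()) {
    var wh = rs.getColumnValue("name");
    snowflake.execute({
      sqlText: 'GRANT MONITOR ON WAREHOUSE "' + wh + '" TO ROLE AD_COMPUTE_MONITOR'
    });
  }
  return "Granted MONITOR on all warehouses";
$$;

-- 2. Grant usage on the procedure
GRANT USAGE ON PROCEDURE AD_MONITOR_DB.PUBLIC.GRANT_MONITOR_ON_ALL_WAREHOUSES()
  TO ROLE AD_COMPUTE_MONITOR;

-- 3. Run the procedure
CALL AD_MONITOR_DB.PUBLIC.GRANT_MONITOR_ON_ALL_WAREHOUSES();
```

Because Snowflake has no future grants for warehouses, re-run (or schedule) this procedure after creating new warehouses so they are picked up.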

Optional Steps

Optional 1: Allow storage integrations

Required if ADOC will use external stages (like S3 or Azure Blob) through secure integrations.
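A sketch of the grants involved; the integration name below is illustrative:

```sql
-- Allow the monitoring role to create storage integrations (account-level privilege)
GRANT CREATE INTEGRATION ON ACCOUNT TO ROLE AD_COMPUTE_MONITOR;

-- Or, if an integration already exists, grant usage on it
GRANT USAGE ON INTEGRATION MY_S3_INTEGRATION TO ROLE AD_COMPUTE_MONITOR;
```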

Optional 2: Enable ORGADMIN Role

Allows access to ORGANIZATION_USAGE schema for consolidated org-wide monitoring.
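Assuming the ORGADMIN role is enabled for your account, granting it to the monitoring user might look like this:

```sql
-- Run in the account where the ORGADMIN role is enabled; requires ACCOUNTADMIN
USE ROLE ACCOUNTADMIN;
GRANT ROLE ORGADMIN TO USER AD_USER;
```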

Optional 3: Troubleshooting & FAQ

  • Stage creation failure: Usually caused by missing ownership privileges on the monitoring database; see the Troubleshooting table later in this document.
  • OAuth reauthentication: If default roles/namespaces aren’t set, the user may need to re-authenticate via OAuth.
  • Changing the role for existing users: Update the user’s default role (ALTER USER ... SET DEFAULT_ROLE) and grant the monitoring role to the user.

Add Snowflake as a Data Source

Step 1: Start Setup

  1. In the ADOC UI, click Register from the left menu.

  2. Click Add Data Source.

  3. Select Snowflake.

  4. On the Basic Details page:

    1. Enter a name for this data source.
    2. (Optional) Add a description.
    3. Choose your Data Plane or click Setup Data Plane to create one.
    4. Enable at least one of the following to continue:
      • Compute Observability
      • Data Reliability Monitoring
  5. Click Next.

Step 2: Add Connection Details

  1. Enter the Snowflake URL (e.g., https://<account>.snowflakecomputing.com)

  2. Provide the following Snowflake credentials:

    1. Username
    2. Password
    3. Role (e.g., AD_COMPUTE_MONITOR)
  3. Select your Data Plane Engine:

    • Spark (for external compute)
    • Pushdown (for in-Snowflake processing)

Note Pushdown is more cost-effective, while Spark provides more control over compute.

  4. (Optional) Enable OAuth: If using OAuth, toggle Enable OAuth and provide:

    1. Authorization Endpoint
    2. Token Endpoint
    3. Client ID / Client Secret
    4. (Optional) Enable PKCE
  5. (Optional for Pushdown) Configure Global Storage: If Pushdown is selected, you can optionally toggle Configure Global Storage in Snowflake and enter:

    1. Stage Name
    2. Stage File Format (e.g., PARQUET, CSV)
  6. Click Test Connection. If the connection is successful, you'll see a “Connected” message. If not, check the credentials and try again.

  7. Click Next.

Step 3: Set Up Observability

If you enabled Compute Observability, fill in the Compute Observability section:

Field | What to Enter
Warehouse | Select one or more Snowflake warehouses
Database | Name of the monitoring database (default: AD_MONITOR_DB)
Cost per Credit | Your cost per Snowflake credit (e.g., 2.5)
Query Cost Type | Choose Acceldata Attributed Query Cost or Snowflake Attributed Query Cost
Snowflake Fetch Past Data | Choose the data range to backfill (15 days – 1 year)
Polling Schedule | Set the time and timezone for scheduled polling
Configure External Stage (Optional) | Enable this if using Snowflake-managed external storage

Why this matters: These settings allow ADOC to compute credit usage and query-level metrics from Snowflake, giving you visibility into workload costs and efficiency.

If you enabled Data Reliability, fill in the Data Reliability section:

Field | What to Enter
Warehouse | Select the Snowflake warehouse(s) used for crawling
Databases | Choose all applicable databases to monitor
Enable Query Analysis | Turn ON to analyze how datasets are queried
Enable Crawler Execution | Schedule crawlers to check data quality
Timezone & Schedule | Choose when crawlers should run (daily, weekly, etc.)

Why this matters: These options allow ADOC to scan your Snowflake datasets for freshness, schema drift, null values, and other data quality issues.

Click Submit to complete the setup.

You’ll see a new Snowflake card on the Data Sources page with connection and crawler details.

Optimizing Data Partitioning

To tune performance for large datasets, you can adjust the partition size ADOC uses for Snowflake reads through an environment variable in your ADOC Data Plane configuration:

Field | Description
Default Snowflake Partition | 100 MB
ADOC Default | 2000 MB
Use Case | Lower the value to increase parallelism for large datasets

Note Smaller partitions = higher concurrency = faster data processing
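As a sketch, the override might be set like this in the Data Plane environment. The variable name below is hypothetical; confirm the exact key for your ADOC Data Plane version:

```shell
# Hypothetical variable name; confirm the exact key in your ADOC Data Plane configuration.
# Lowers the partition size from the 2000 MB ADOC default to increase parallelism.
export SNOWFLAKE_PARTITION_SIZE_MB=500
```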

Connect Using AWS PrivateLink

You can connect Snowflake to ADOC securely over AWS PrivateLink for improved network isolation.

Prerequisites:

  • AWS account with necessary permissions
  • VPC in us-west-2
  • Snowflake account ready

Why Use PrivateLink?

AWS PrivateLink allows your Snowflake data to connect with ADOC services over a secure, private network path that never traverses the public internet. This enhances data security, reduces latency, and improves performance.

Step 1: Share Your AWS Account ID

Share your AWS account ID with the Acceldata support team. Acceldata will use this ID to authorize your account for PrivateLink connectivity.

Note This is a one-time setup per AWS account.

Step 2: Create VPC Endpoints

In the AWS Management Console:

  1. Navigate to VPC.
  2. In the navigation pane, choose Endpoints.
  3. Select Create Endpoint.
  4. Create the following two endpoints:
Service Name | Endpoint
ADOC Control Plane | com.amazonaws.vpce.us-west-2.vpce-svc-091c001843d33bbaa
Secure Relay | com.amazonaws.vpce.us-west-2.vpce-svc-02830f09899d40f01

Note Make sure the VPC region is set to us-west-2.
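The same endpoint can be created from the AWS CLI; this is a sketch with placeholder VPC, subnet, and security-group IDs that you must replace with your own:

```shell
# Creates the ADOC Control Plane interface endpoint; repeat with the
# Secure Relay service name for the second endpoint. IDs are placeholders.
aws ec2 create-vpc-endpoint \
  --region us-west-2 \
  --vpc-id vpc-0123456789abcdef0 \
  --vpc-endpoint-type Interface \
  --service-name com.amazonaws.vpce.us-west-2.vpce-svc-091c001843d33bbaa \
  --subnet-ids subnet-0123456789abcdef0 \
  --security-group-ids sg-0123456789abcdef0
```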

Step 3: Configure DNS Using Route 53

In Amazon Route 53:

  1. Navigate to the Hosted Zones section for your domain.
  2. Add the following A records:
Record Name | Type | Value
<tenant>.acceldata.app | A | IP address of the ADOC Control Plane VPC endpoint. Replace <tenant> with your tenant subdomain.
dataplane.acceldata.app | A | IP address of the Secure Relay VPC endpoint. Use the IP address assigned to each endpoint in your VPC.

Note These DNS records ensure your traffic is routed directly to the ADOC services via PrivateLink. You’ll need to replace the placeholder values with your actual VPC endpoint IPs.
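One of these A records could be created from the AWS CLI like this; the hosted-zone ID and IP address are placeholders for your own values:

```shell
# Upserts the Secure Relay A record; repeat for <tenant>.acceldata.app.
# Replace the hosted-zone ID and IP with the values from your account.
aws route53 change-resource-record-sets \
  --hosted-zone-id Z0123456789ABCDEFGHIJ \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "dataplane.acceldata.app",
        "Type": "A",
        "TTL": 300,
        "ResourceRecords": [{"Value": "10.0.0.10"}]
      }
    }]
  }'
```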

Security Tip Use least-privileged IAM roles when creating and attaching these endpoints.

Troubleshooting

Issue | Possible Cause | Resolution
Stage creation fails | Role doesn’t have ownership privileges | Log in as ACCOUNTADMIN, check the monitoring database, run SHOW STAGES, and use GRANT OWNERSHIP to fix role access
OAuth fails | User role/namespace not set | Ensure the Snowflake user does not have the ACCOUNTADMIN role; re-authenticate via ADOC
Connection test fails | Invalid credentials, missing grants, or wrong role | Double-check the Snowflake URL, credentials, warehouse, and role permissions
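The stage-ownership fix described above can be sketched as follows; replace <stage_name> with the stage reported by SHOW STAGES:

```sql
USE ROLE ACCOUNTADMIN;
USE DATABASE AD_MONITOR_DB;

-- Inspect existing stages and their owning roles
SHOW STAGES;

-- Transfer ownership of the affected stage, preserving existing grants
GRANT OWNERSHIP ON STAGE <stage_name> TO ROLE AD_COMPUTE_MONITOR COPY CURRENT GRANTS;
```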

What’s Next

After you’ve connected Snowflake Compute in ADOC:

  • Visit the Compute tab to view warehouse performance and query execution metrics.
  • Track credit usage trends to identify cost-saving opportunities.
  • Set up alerts for spikes in compute usage, query slowdowns, or unusual activity.
  • Explore usage patterns to optimize warehouse sizing and scheduling.