Release Notes 3.15.0

Date: 02 December 2024

This section describes the new features and enhancements introduced in this release.

Data Reliability

  • Databricks SQL Engine Pushdown Support: Databricks data sources now support the Pushdown engine for data quality checks, data profiling, and SQL operations. Users can switch between the Pushdown and Spark engines at any time, ensuring consistent results across both engines. For more information, see Databricks.
  • Data Cadence Support for Kafka Data Sources: ADOC now supports hourly scheduled jobs for Kafka data sources, capturing key metrics such as the number of messages, the last message update time, and hourly message rates (see the sketch after this list). The Data Freshness policy now runs automatically every hour. For more information, see Cadence Metrics and Freshness Policies.
  • UDT Usage Impact Visibility: This enhancement provides users with a comprehensive view of all locations where a User-Defined Template (UDT) is currently in use. Before editing or deleting a UDT, users can review its usage to better understand and mitigate potential impacts. For more information, see User Defined Templates.
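The following is a minimal sketch of how an hourly message count, one of the cadence metrics described above, could be derived from Kafka offsets. It assumes the kafka-python client; the function name and parameters are hypothetical and are not part of ADOC.

```python
# Minimal sketch: approximate a topic's message count for the last hour
# by comparing end offsets with the earliest offsets inside the window.
from datetime import datetime, timedelta, timezone
from kafka import KafkaConsumer, TopicPartition

def hourly_message_count(bootstrap_servers: str, topic: str) -> int:
    consumer = KafkaConsumer(bootstrap_servers=bootstrap_servers)
    partitions = [TopicPartition(topic, p)
                  for p in consumer.partitions_for_topic(topic)]
    window_start_ms = int((datetime.now(timezone.utc)
                           - timedelta(hours=1)).timestamp() * 1000)
    # Earliest offset at or after the window start, per partition.
    start_offsets = consumer.offsets_for_times(
        {tp: window_start_ms for tp in partitions})
    end_offsets = consumer.end_offsets(partitions)
    total = 0
    for tp in partitions:
        start = start_offsets[tp]
        if start is not None:  # None means no messages arrived in the window
            total += end_offsets[tp] - start.offset
    consumer.close()
    return total
```

Tracking this count over successive windows yields the hourly message rate; the last message update time can be read similarly from the newest record in each partition.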

Compute

  • CSV Export for Recommendations View: A new CSV export option has been added for filtered recommendations, enhancing data analysis and sharing capabilities. For more information, see Recommendations.
  • Accurate Cost Calculation with Service-Level Rates: Using a single Cost Per Credit (CPC) for all Snowflake services could lead to inaccurate cost attribution when different rates apply. To address this, the system now retrieves service-specific rates directly from Snowflake metadata, ensuring more precise calculations. The system uses the user-provided CPC as a fallback if service-specific rates are not available (see the sketch after this list). For more information, see Snowflake.
  • Compute Widgetization for Customizable Dashboards: Additional widgets have been introduced to the Select Widgets section, allowing you to further customize your dashboards. New options include widgets from Snowflake Config, User Adoption, Housekeeping, and more, enabling enhanced flexibility and insights.
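As a rough illustration of the service-level rate fallback described above, the sketch below shows the calculation order. The service names, rate values, and function are made-up examples, not actual Snowflake metadata or ADOC internals.

```python
# Minimal sketch: prefer a service-specific rate, fall back to the
# user-provided Cost Per Credit (CPC) when no rate is available.
USER_CPC = 3.00  # user-provided fallback rate (USD per credit), illustrative

# Service-level rates as they might be read from Snowflake metadata;
# names and values here are invented for the example.
service_rates = {"COMPUTE": 2.00, "SERVERLESS_TASK": 0.90, "CLOUD_SERVICES": 2.00}

def attributed_cost(service_type: str, credits_used: float) -> float:
    rate = service_rates.get(service_type, USER_CPC)  # fallback path
    return credits_used * rate

print(attributed_cost("COMPUTE", 10.0))      # 20.0, uses the service rate
print(attributed_cost("UNKNOWN_SVC", 10.0))  # 30.0, falls back to the CPC
```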

User Experience

  • Enhanced Filtering for Data Reliability Widgets: Users can now add and apply filters such as tag, policy type, and last result status to Data Reliability widgets when creating a new dashboard, as sketched below. This enhancement provides greater customization and focus for dashboard insights. For more information, see Dashboard.
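The hypothetical widget definition below illustrates the new filter fields; the keys and values are illustrative only and do not reflect the actual ADOC dashboard schema.

```python
# Illustrative only: a Data Reliability widget configured with the
# newly supported filters (tag, policy type, last result status).
widget = {
    "type": "data_reliability_summary",
    "filters": {
        "tag": ["finance"],
        "policy_type": ["Data Quality"],
        "last_result_status": ["Errored", "Warning"],
    },
}
```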

This section outlines the issues that have been resolved in this release.

Data Reliability

  • Resolved an issue where the Performance Trend widget did not retain the selected metric when the dashboard was saved. Each metric now displays as a separate chart and is saved by default.
  • Fixed an issue where the notification channels shown on the edit policy tab after a policy import did not match the actual selection. When a policy is imported, the target environment now imports only the notification channels whose name and channel type match those in the source environment.
  • Resolved an issue where profile scheduling in the Profile settings did not trigger at the scheduled time.
  • Resolved an issue where SQL views failed to function when Pushdown was used as the data plane engine.
  • Fixed an issue that prevented the retrieval of the '_SYS_BIC' schema when adding an SAP HANA data source, ensuring that all schemas required for data profiling and quality checks are included.
  • Resolved an issue where the lookup SQL filter in DQ policies did not apply filters to the underlying reference asset.
  • Fixed an issue where users received a false alert about data plane role permissions while creating a DQ policy, even though the functionality was working properly.

Compute

  • Resolved an issue where the email notification channel failed to handle multiple email addresses, causing errors when more than one address was added to the channel.
  • Fixed an issue where alert queries caused database CPU spikes; the queries have been optimized to reduce their impact.
  • Resolved an issue where the attributed query cost calculation overlooked a few query batches.
  • Fixed an issue where the DLT pipeline status was not updated on the cluster listing page; it now updates as expected.
  • Resolved an issue where the driver stats charts in the Databricks job run details for DLT pipelines did not load; all charts now load correctly.

This section lists the known limitations that persist in this release.

  • In the Freshness Trend chart on the Data Cadence tab, the upper bound, lower bound, and change in row count metrics are available only for hourly-based time filters such as Today and Yesterday.
  • Anomaly detection is not supported for nested columns of an asset.
  • When the refresh token time expires for ServiceNow OAuth Integration, the status remains as Configured instead of changing to Expired.
  • Dataproc is only supported on GCP as the cloud provider for data plane installation.
  • The Glossary window fails to load data due to Dataproc API failures.
  • The Smart tag feature is currently available only for Kubernetes-driven Spark deployments.
  • Credentials with two-factor authentication (2FA) enabled do not work.
  • User-specific usage data for a given day may not be accurately displayed or retrieved.
  • Issue with GCP test connections.
  • Profiling in BigQuery fails for tables when the Partition filter is enabled (see the sketch after this list).
  • DQ policies sometimes fail due to an inability to verify columns whose names contain spaces or special characters.
  • Unable to pick a reference asset on a Data Policy template when defining a Lookup Rule.
  • The data lineage view for job runs is not visible on Databricks' Job Run Details page.
  • If all values of a primitive string or complex data type column are null, profiling runs but the data type is reported as Unknown.
  • Not all failed events are captured on the Audit Logs page.
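For context on the BigQuery partition filter limitation noted above, the sketch below reproduces the underlying BigQuery behavior: a table created with require_partition_filter set to true rejects any query, including profiling queries, that does not constrain the partition column. The dataset and table names are placeholders.

```python
# Illustrative only: BigQuery rejects unfiltered queries against tables
# that require a partition filter, which is what breaks profiling.
from google.cloud import bigquery

client = bigquery.Client()  # assumes default credentials and project
sql = "SELECT COUNT(*) FROM `my_dataset.partitioned_table`"  # no partition filter
try:
    list(client.query(sql).result())
except Exception as err:
    # BigQuery returns an error like: "Cannot query over table ... without
    # a filter over column(s) ... that can be used for partition elimination"
    print(err)
```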

This section provides important links to help you get started with ADOC.
