Release Notes 3.16.0

Date: 19 December, 2024

This section describes the new features and enhancements introduced in this release.

Data Reliability

  • Crawler Improvements: Improved crawler reliability and reduced crawling time for large datasets, ensuring faster completion updates and seamless re-runs.
  • Revamped Manage Policies Page: The updated Manage Policies page now includes sortable columns for the quality score, execution status, mode, records processed, and duration of the latest run, along with the total execution count, 30-day average score, and open alert count for richer insights. For more information, see Manage Policies.

Pipeline

  • Pipeline UI Improvements: Pipeline interfaces now load faster and more efficiently because the /details API calls have been split and optimized. With these enhancements, users can navigate pipelines more smoothly, configure runs quickly, and gain the insights they need to make better, faster decisions.

Compute

  • Compute UI/UX Simplification: The Compute module’s navigation and data presentation have been streamlined. The left panel now collapses into a cleaner structure, grouping key Snowflake functions together, while data sources are selectable via an intuitive dropdown. Additional improvements include simplified Admin and Performance views, enhanced guard rails visibility, and refined Warehouse Utilization metrics. These changes make it easier for users to find critical information and quickly navigate Compute functionality. For more information, see About Compute.

This section outlines the issues that have been resolved in this release.

Data Reliability

  • Resolved the issue causing a NullPointerException during Databricks partitioned table crawling, which prevented column metadata from being captured.
  • Fixed the issue causing job concurrency to malfunction when jobs fail or are deleted in the backend, leading to blocked queues.
  • Fixed the issue causing alerts to be sent for results above the success threshold even when 'Notify on Success' was turned off.
  • Resolved the issue preventing the relationships of Power BI reports from loading on the asset relationships page.
  • Fixed the issue causing jobs listed as 'Running' to remain stuck in a 'Queued' state indefinitely, with no status updates.
  • Fixed the issue preventing the execution of a second policy on the Policies page without refreshing the page.
  • Fixed the issue causing the Enumeration check to fail for Boolean datatype columns due to a data type mismatch.
  • Fixed the issue causing bad records to be visible only for the configured rule columns, even when all columns were selected and Spark SQL filters were enabled during policy creation.
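The Boolean enumeration fix above can be illustrated with a minimal sketch (hypothetical helper, not ADOC's actual implementation): an enumeration check that compares raw column values against string allowed-values flags every Boolean row as a violation unless both sides are normalized to a common type first.

```python
def enumeration_check(values, allowed):
    """Return the values that are not in the allowed set.

    A naive comparison fails for Boolean columns: True != "true",
    so every row is reported as a violation (the fixed bug).
    Normalizing both sides to lowercase strings avoids the
    data type mismatch.
    """
    normalized_allowed = {str(a).lower() for a in allowed}
    return [v for v in values if str(v).lower() not in normalized_allowed]

# A Boolean column checked against a string-valued enumeration:
violations = enumeration_check([True, False, True], ["true", "false"])
```

With normalization, the Boolean values match the string enumeration and no false violations are reported.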

Compute

  • Resolved an issue where the Cost Optimization view displayed outdated contract details instead of the updated information.
  • Fixed the issue causing abnormally high warehouse cost due to frequent query executions.
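The warehouse cost fix above addresses costs driven by frequent query executions. One generic way to curb this (a sketch only, with hypothetical names; not necessarily how ADOC implements it) is to cache query results for a short TTL so repeated calls do not re-execute the query and keep the warehouse running:

```python
import time

def ttl_cache(ttl_seconds):
    """Cache a function's results for ttl_seconds so repeated
    calls with the same arguments skip re-execution."""
    def decorator(fn):
        cache = {}
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]
            result = fn(*args)
            cache[args] = (now, result)
            return result
        return wrapper
    return decorator

executed = []  # records which queries actually hit the warehouse

@ttl_cache(ttl_seconds=300)
def run_query(sql):  # stand-in for a real warehouse query
    executed.append(sql)
    return f"result of {sql}"

run_query("SELECT 1")
run_query("SELECT 1")  # served from cache; warehouse not hit again
```

Only the first call executes against the warehouse; the second is served from the cache within the TTL window.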

This section lists the known limitations that persist in this release.

  • In the Freshness Trend chart on the Data Cadence tab, the upper bound, lower bound, and change in row count metrics are available only for hourly-based time filters such as Today and Yesterday.
  • Anomaly detection is not supported for nested columns of an asset.
  • When the refresh token time expires for ServiceNow OAuth Integration, the status remains as Configured instead of changing to Expired.
  • Dataproc is only supported on GCP as the cloud provider for data plane installation.
  • The Glossary window fails to load data when Dataproc API calls fail.
  • The Smart tag feature is currently available only for Kubernetes-driven Spark deployments.
  • When 2FA is activated for a credential, the credential does not work.
  • User-specific usage data for a specific day may not be accurately displayed or retrieved.
  • Issue with GCP test connections.
  • Profiling in BigQuery fails for tables when the Partition filter is enabled.
  • A DQ policy sometimes fails because a column name containing spaces or special characters cannot be verified.
  • Unable to pick a reference asset on a Data Policy template when defining a Lookup Rule.
  • The data lineage view for job runs is not visible on Databricks' Job Run Details page.
  • If all values are null for primitive string and complex data types, profiling will occur but data type will be Unknown.
  • Not all failed events are captured on the Audit Logs page.
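The BigQuery limitation above typically stems from tables created with the require_partition_filter option: BigQuery rejects any query that lacks a predicate on the partitioning column, so a profiling query that scans the whole table fails. A minimal sketch of a workaround (hypothetical helper and table names; the partition column is assumed to be known) appends such a predicate before the query runs:

```python
def add_partition_predicate(profiling_sql, partition_column, start, end):
    """Append a predicate on the partitioning column so the query
    satisfies BigQuery's require_partition_filter setting."""
    clause = f"{partition_column} BETWEEN '{start}' AND '{end}'"
    joiner = " AND " if " WHERE " in profiling_sql.upper() else " WHERE "
    return profiling_sql + joiner + clause

# Hypothetical profiling query against a date-partitioned table:
sql = add_partition_predicate(
    "SELECT COUNT(*) FROM dataset.events",
    "event_date", "2024-12-01", "2024-12-19",
)
```

Until the limitation is lifted, disabling require_partition_filter on the table or profiling a partition-bounded view are alternative workarounds.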

This section lists important links to get started with ADOC.
