`docs/managed-datahub/observe/column-assertions.md` — 3 additions, 3 deletions

```diff
@@ -18,7 +18,7 @@ import FeatureAvailability from '@site/src/components/FeatureAvailability';
 
 Can you remember a time when an important warehouse table column changed dramatically, with little or no notice? Perhaps the number of null values suddenly spiked, or a new value was added to a fixed set of possible values. If the answer is yes, how did you initially find out? We'll take a guess - someone looking at an internal reporting dashboard or worse, a user using your product, sounded an alarm when a number looked a bit out of the ordinary.
 
-There are many reasons why important columns in your Snowflake, Redshift, or BigQuery tables may change - application code bugs, new feature rollouts, etc. Oftentimes, these changes break important assumptions made about the data used in building key downstream data products like reporting dashboards or data-driven product features.
+There are many reasons why important columns in your Snowflake, Redshift, BigQuery, or Databricks tables may change - application code bugs, new feature rollouts, etc. Oftentimes, these changes break important assumptions made about the data used in building key downstream data products like reporting dashboards or data-driven product features.
 
 What if you could reduce the time to detect these incidents, so that the people responsible for the data were made aware of data issues before anyone else? With Acryl DataHub Column Assertions, you can.
 
@@ -41,7 +41,7 @@ Note that an Ingestion Source _must_ be configured with the data platform of you
 Acryl DataHub's **Ingestion** tab.
 
 > Note that Column Assertions are not yet supported if you are connecting to your warehouse
-> using the DataHub CLI or a Remote Ingestion Executor.
+> using the DataHub CLI.
 
 ## What is a Column Assertion?
 
@@ -121,7 +121,7 @@ another always-increasing number - that can be used to find the "new rows" that
 `Edit Assertions` and `Edit Monitors` privileges for the entity. This is granted to Entity owners by default.
 
 2. **Data Platform Connection**: In order to create a Column Assertion, you'll need to have an **Ingestion Source**
-   configured to your Data Platform: Snowflake, BigQuery, or Redshift under the **Ingestion** tab.
+   configured to your Data Platform: Snowflake, BigQuery, Redshift, or Databricks under the **Ingestion** tab.
 
 Once these are in place, you're ready to create your Column Assertions!
```
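The kind of column-level check these docs describe can be sketched in a few lines. This is an illustrative sketch only, not Acryl's implementation: the function names, the `rows` sample, and the `max_null_fraction` threshold are all hypothetical, standing in for a check (here, "null fraction stays below a bound") that the product evaluates inside your warehouse.

```python
# Illustrative sketch only: the kind of check a Column Assertion automates.
# All names and thresholds here are hypothetical, not part of any DataHub API.

def null_fraction(rows, column):
    """Fraction of rows in which `column` is NULL (None)."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def column_assertion_passes(rows, column, max_null_fraction=0.05):
    """Pass only while the observed null fraction stays within the allowed bound."""
    return null_fraction(rows, column) <= max_null_fraction

rows = [
    {"email": "a@example.com"},
    {"email": None},            # a sudden spike in nulls is what this catches
    {"email": "b@example.com"},
    {"email": "c@example.com"},
]
print(column_assertion_passes(rows, "email"))  # 25% null > 5% bound -> False
```

In the product, the equivalent computation runs on a schedule against the warehouse table itself, and a failing evaluation raises an incident instead of returning `False`.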
`docs/managed-datahub/observe/custom-sql-assertions.md` — 3 additions, 3 deletions

```diff
@@ -22,7 +22,7 @@ If the answer is yes, how did you find out? We'll take a guess - someone looking
 a number looked a bit out of the ordinary. Perhaps your table initially tracked purchases made on your company's e-commerce web store, but suddenly began to include purchases made
 through your company's new mobile app.
 
-There are many reasons why an important Table on Snowflake, Redshift, or BigQuery may change in its meaning - application code bugs, new feature rollouts,
+There are many reasons why an important Table on Snowflake, Redshift, BigQuery, or Databricks may change in its meaning - application code bugs, new feature rollouts,
 changes to key metric definitions, etc. Oftentimes, these changes break important assumptions made about the data used in building key downstream data products
 like reporting dashboards or data-driven product features.
 
@@ -49,7 +49,7 @@ Note that an Ingestion Source _must_ be configured with the data platform of you
 tab.
 
 > Note that SQL Assertions are not yet supported if you are connecting to your warehouse
-> using the DataHub CLI or a Remote Ingestion Executor.
+> using the DataHub CLI.
 
 ## What is a Custom SQL Assertion?
 
@@ -120,7 +120,7 @@ The **Assertion Description**: This is a human-readable description of the Asser
 `Edit Assertions`, `Edit Monitors`, **and the additional `Edit SQL Assertion Monitors`** privileges for the entity. This is granted to Entity owners by default.
 
 2. **Data Platform Connection**: In order to create a Custom SQL Assertion, you'll need to have an **Ingestion Source** configured to your
-   Data Platform: Snowflake, BigQuery, or Redshift under the **Integrations** tab.
+   Data Platform: Snowflake, BigQuery, Redshift, or Databricks under the **Integrations** tab.
 
 Once these are in place, you're ready to create your Custom SQL Assertions!
```
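A Custom SQL Assertion pairs a query that returns a single value with a condition on that value. The mechanics can be sketched against an in-memory SQLite table; the `purchases` table, the query, and the "no unexpected channels" condition are hypothetical, and Acryl evaluates the real query in your warehouse rather than locally.

```python
# Illustrative sketch only: evaluating a "custom SQL assertion" by hand.
# The table, query, and expected condition are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE purchases (id INTEGER, channel TEXT)")
conn.executemany(
    "INSERT INTO purchases VALUES (?, ?)",
    [(1, "web"), (2, "web"), (3, "mobile")],  # the mobile-app rows sneak in
)

# The assertion: a single-value SQL query plus a condition on its result.
sql = "SELECT COUNT(*) FROM purchases WHERE channel NOT IN ('web')"
(value,) = conn.execute(sql).fetchone()
passed = value == 0  # e.g. "no purchases from unexpected channels"
print(value, passed)  # 1 False -> the new mobile purchases trip the assertion
```

This mirrors the docs' motivating example: a table that quietly starts including mobile-app purchases fails the assertion on its next scheduled evaluation, before a dashboard reader notices.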
`docs/managed-datahub/observe/volume-assertions.md` — 4 additions, 4 deletions

```diff
@@ -22,7 +22,7 @@ If the answer is yes, how did you find out? We'll take a guess - someone looking
 a number looked a bit out of the ordinary. Perhaps your table initially tracked purchases made on your company's e-commerce web store, but suddenly began to include purchases made
 through your company's new mobile app.
 
-There are many reasons why an important Table on Snowflake, Redshift, or BigQuery may change in its meaning - application code bugs, new feature rollouts,
+There are many reasons why an important Table on Snowflake, Redshift, BigQuery, or Databricks may change in its meaning - application code bugs, new feature rollouts,
 changes to key metric definitions, etc. Oftentimes, these changes break important assumptions made about the data used in building key downstream data products
 like reporting dashboards or data-driven product features.
 
@@ -50,7 +50,7 @@ Note that an Ingestion Source _must_ be configured with the data platform of you
 tab.
 
 > Note that Volume Assertions are not yet supported if you are connecting to your warehouse
-> using the DataHub CLI or a Remote Ingestion Executor.
+> using the DataHub CLI.
 
 ## What is a Volume Assertion?
 
@@ -140,7 +140,7 @@ Volume Assertions also have an off switch: they can be started or stopped at any
 `Edit Assertions` and `Edit Monitors` privileges for the entity. This is granted to Entity owners by default.
 
 2. **Data Platform Connection**: In order to create a Volume Assertion, you'll need to have an **Ingestion Source** configured to your
-   Data Platform: Snowflake, BigQuery, or Redshift under the **Integrations** tab.
+   Data Platform: Snowflake, BigQuery, Redshift, or Databricks under the **Integrations** tab.
 
 Once these are in place, you're ready to create your Volume Assertions!
 
@@ -238,7 +238,7 @@ As part of the **Acryl Observe** module, Acryl DataHub also provides **Smart Ass
 dynamic, AI-powered Volume Assertions that you can use to monitor the volume of important warehouse Tables, without
 requiring any manual setup.
 
-If Acryl DataHub is able to detect a pattern in the volume of a Snowflake, Redshift, or BigQuery Table, you'll find
+If Acryl DataHub is able to detect a pattern in the volume of a Snowflake, Redshift, BigQuery, or Databricks Table, you'll find
 a recommended Smart Assertion under the `Validations` tab on the Table profile page:
```
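At its simplest, the check a (non-smart, fixed-range) Volume Assertion runs on a schedule is a row count compared against expected bounds. The sketch below is illustrative only: the `events` table and the `min_rows`/`max_rows` bounds are hypothetical, and Smart Assertions infer the expected range from observed patterns instead of taking it as fixed input.

```python
# Illustrative sketch only: the scheduled row-count check behind a
# fixed-range Volume Assertion. Table and bounds are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER)")
conn.executemany("INSERT INTO events VALUES (?)", [(i,) for i in range(120)])

(row_count,) = conn.execute("SELECT COUNT(*) FROM events").fetchone()

# The assertion: row count must stay within the expected range.
min_rows, max_rows = 100, 1_000_000
passed = min_rows <= row_count <= max_rows
print(row_count, passed)  # 120 True
```

A drop below `min_rows` (a stalled pipeline) or a jump above `max_rows` (duplicated loads) would fail the next evaluation and surface under the Table's `Validations` tab.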