Additional Features
Data Catalog
Several Data Quality-related operations can be performed directly from the Data Catalog:
Add to Data Quality Rules: Use this option to add the selected data object to existing Data Quality Rules. Supported for Tables, Table Columns, Files, File Columns, and Codes.
Remove from Data Quality Rules: Use this option to remove the selected data object from existing Data Quality Rules. Supported for Tables, Table Columns, Files, File Columns, and Codes.
Anomaly Detection Settings: Provides object-level settings for the detection of data anomalies. Supported for Schemas, Tables, and Table Columns.
Update Threshold Score: Allows users to set threshold values for the Data Quality Score. Clicking this option opens a pop-up where users can update the Accepted Score and Optimal Score for a specific data object. Based on these thresholds, the Data Quality Score is displayed with a distinct color code depending on the range it falls in, as illustrated in the sketch below.
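The color coding behaves like a simple banded threshold comparison. The sketch below is illustrative only; the cut-off values, color names, and the score_color helper are hypothetical and not taken from the OvalEdge product.

```python
# Illustrative sketch only: threshold values and color names are hypothetical,
# not taken from OvalEdge.
def score_color(score: float, accepted: float = 70.0, optimal: float = 90.0) -> str:
    """Map a Data Quality Score to a display color band."""
    if score >= optimal:
        return "green"   # at or above the Optimal Score
    if score >= accepted:
        return "yellow"  # between the Accepted and Optimal Scores
    return "red"         # below the Accepted Score

print(score_color(95))  # green
print(score_color(75))  # yellow
print(score_color(40))  # red
```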
Load Metadata from Files
Two templates are available for uploading rules and functions through Load Metadata from Files.
Data Quality Rules: Data Quality Rules can be configured using the Data Quality Rule Load Metadata from Files template. To select it, navigate to Advanced Tools > Load Metadata from Files > Select Data Quality Rule Template. (An illustrative filled-template sketch follows this list.)
Data Quality Functions: User-defined Data Quality Functions can be configured using the Data Quality Function Load Metadata from Files template, allowing users to create their own Data Quality Functions and associate objects with them for Data Quality Rule execution. This gives users significant flexibility to customize Data Quality Functions according to their specific requirements and business standards. To select it, navigate to Advanced Tools > Load Metadata from Files > Select Data Quality Function Template.
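To make the upload workflow concrete, the sketch below prepares a filled template file with Python's csv module. The column names, function name, and object path are placeholders for illustration only; the actual columns are defined by the template downloaded from Advanced Tools > Load Metadata from Files.

```python
# Hypothetical sketch: the column names, function name, and object path below are
# placeholders; use the headers from the downloaded OvalEdge template instead.
import csv

rows = [
    {
        "Rule Name": "Null check on customer_id",                    # placeholder
        "Function": "Null Check",                                    # placeholder
        "Associated Object": "sales.public.customers.customer_id",   # placeholder
        "Purpose": "customer_id must never be null",                 # placeholder
    },
]

with open("dq_rule_template_filled.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

# The filled file is then uploaded through Advanced Tools > Load Metadata from Files.
```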
Data Quality Rule APIs
OvalEdge supports thirteen Data Quality APIs, each performing one of the operations listed below (an illustrative request sketch follows the list):
Updating Data Quality Rule Status
Executing Data Quality Rule
Updating Data Quality Rule Schedule
Adding/Updating Data Quality Rule
Adding Objects to a Data Quality Rule
Adding File Object Parameters in a Data Quality Rule
Fetching Associated Objects from a Data Quality Rule
Fetching Custom Fields from a Data Quality Rule
Fetching a list of Data Quality Rules
Fetching a single Data Quality Rule
Deleting a Data Quality Rule
Deleting Data Quality Rule Schedule
Deleting Associated Objects from a Data Quality Rule
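As a rough illustration of how such an API might be called, the sketch below triggers the "Executing Data Quality Rule" operation over REST. The base URL, endpoint path, authentication header, and response handling are assumptions for illustration only, not the documented OvalEdge API contract; consult the Data Quality Rule API reference for the actual endpoints.

```python
# Illustrative sketch only: the endpoint path, auth header, and response shape
# are assumptions, not the documented OvalEdge API contract.
import requests

BASE_URL = "https://ovaledge.example.com/api"  # hypothetical base URL
API_KEY = "your-api-key"                       # hypothetical credential

def execute_dq_rule(rule_id: int) -> dict:
    """Trigger execution of a Data Quality Rule by id (hypothetical endpoint)."""
    response = requests.post(
        f"{BASE_URL}/dataqualityrules/{rule_id}/execute",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(execute_dq_rule(101))
```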
Notifications
Data Quality Rule - Notify On Failure: Triggered when a data quality rule fails. Recipients: Steward and Contacts.
Data Quality Rule - Notify On Success: Triggered when a data quality rule is successful. Recipients: Contacts.
Data Quality Rule - Notify On Edit: Triggered when a data quality rule is edited. Recipients: Contacts.
Anomaly Detected: Triggered when an anomaly is detected during the most recent profiling. Recipients: Object Custodian.
Remediation Alert: Triggered when a failed value is reported in the Remediation Center. Recipients: Object Custodian.
Mentions: Triggered when a user is mentioned in a collaboration. Recipients: Mentioned User.
Data Quality Rule Recommendation Job: Triggered when a job execution is completed on the Rule Recommendation Model. Recipients: Executed User.
System Settings
The Data Quality tab in System Settings contains all settings related to data quality. Each setting is listed below with its default value and description.
anomaly.detection.analysis.algorithm.list (Default: DeviationIQR): Configures the anomaly detection analysis algorithms. Parameters: Enter the names of the algorithms separated by commas. Currently, the 'iqr' and 'deviation' algorithms are supported.
anomaly.detection.analysis.algorithm.selection (Default: IQR): Specifies the algorithm used for anomaly detection on different objects. Parameters: Enter the algorithm that should be used to identify anomalies.
anomaly.detection.analysis.deviation.threshold (Default: 50): The threshold percentage above or below which an anomaly is generated for the rate of change in the data series.
anomaly.detection.analysis.enabled (Default: TRUE): Activates or deactivates the anomaly feature within the application. Parameters: If set to True, the anomaly feature is activated and functions as intended. If set to False, the anomaly feature is deactivated, and the connector-level and data-object-level anomaly settings appear grayed out.
anomaly.detection.analysis.iqr.range (Default: 1-50): The threshold percentage of data change (positive or negative) that, if exceeded, triggers an anomaly.
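To clarify the difference between the two algorithm families named above, the sketch below shows a generic IQR check and a generic deviation (rate-of-change) check. It is a conceptual illustration only, not OvalEdge's implementation; the function names, the k multiplier, and the sample data are assumptions.

```python
# Conceptual sketch of IQR- and deviation-style anomaly checks; this is not
# OvalEdge's implementation, only an illustration of the two algorithm families
# named in the settings above.
from statistics import quantiles

def iqr_anomalies(series: list[float], k: float = 1.5) -> list[float]:
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = quantiles(series, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [x for x in series if x < low or x > high]

def deviation_anomalies(series: list[float], threshold_pct: float = 50.0) -> list[tuple[float, float]]:
    """Flag consecutive points whose rate of change exceeds the threshold percent."""
    flagged = []
    for prev, curr in zip(series, series[1:]):
        if prev != 0 and abs(curr - prev) / abs(prev) * 100 > threshold_pct:
            flagged.append((prev, curr))
    return flagged

row_counts = [100, 102, 98, 250, 101, 99]
print(iqr_anomalies(row_counts))        # [250]
print(deviation_anomalies(row_counts))  # [(98, 250), (250, 101)]
```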
anomaly.detection.default.assignee
data.quality.provider.type (Default: Internal): Switches the DQR (Data Quality Rule) calculation between Internal (OvalEdge) and External (connector) modes. Parameters: If set to Internal, the Data Quality Rules executed in the OvalEdge application are displayed on the dashboard under the Data Quality section. If set to External, the scores are calculated based on the external connector inputs.
dataquality.associatedobjects.additionalcolumns (Default: 20): Sets a limit on the number of parameters that can be configured within the Additional Column details for the Data Quality Center. For instance, a value of 10 enforces a maximum of 10 parameters for the additional columns. Parameters: Enter the value in the field provided. The maximum limit is 20.
dataquality.associatedobjects.max.limit (Default: 1000): Sets the maximum number of data objects that can be associated with a DQ Rule. Parameters: Enter the value in the field provided.
dataquality.remediationcenter.failedrecords.max.limit (Default: 50): Configures the maximum number of failed results of a DQR (Data Quality Rule) displayed in the Control Center tab. Parameters: Enter the value in the field provided to configure the new maximum limit.
dq.dqprinciple.visible (Default: False): Shows or hides the DQ Principle column in the Results tab of Custom SQL functions, which displays the principle value entered in the stats query. Parameters: If set to True, the DQ Principle column is visible. If set to False, it is hidden.
dq.object.filter.maxlength (Default: 500): Configures the maximum allowable length for text filters. Parameters: Enter the value in the field provided.
dqf.querytemplate.modification.enable (Default: False): Controls whether users can modify system-generated query templates across connectors for specific object types and functions. Parameters: If set to True, users can modify the system-generated query templates. If set to False, they cannot. Note: Changes made to system query templates are reflected across the selected connectors. Save any user-written queries at the object level before making modifications.
dqr.databricks (Default: empty): Determines whether Data Quality Rules are processed for a Databricks connection. Parameters: If set to Yes, the DQR is processed for the Databricks connection; if set to No, it is not.
loadmetadata.dqr.update.dataset (Default: False): Enables or disables editing of queries that were added to the application using Load Metadata from Files. Parameters: If set to True, users can edit the queries loaded into the application using Load Metadata from Files. If set to False, they cannot.