When you manually create or update a record, data quality policy rules can apply.

These data quality policy rules can be applied:

Rule type: Validation rules

Check if the data is in line with the defined standards.

Validation rules are applied automatically when you manually enter or update a record.

If data quality policy execution logging is enabled, validation rule execution results are logged. You can review and process the logged validation rule execution results.

Rule type: Duplicate check rules - Basic matching

Check if the data doesn't already exist in the table, based on a combination of table fields.

Duplicate check rules of type Basic matching are applied automatically when you manually enter or update a record.

If merging duplicates is enabled for basic matching, and duplicates are found, you can compare the duplicates and merge field values.

Rule type: Duplicate check rules - Fuzzy matching

Check if the data doesn't already exist in the table, applying fuzzy logic.

To apply duplicate check rules of type Fuzzy matching, you must manually start the check.

If duplicates are found, you can compare the duplicates and merge field values.

Rule type: Enrichment rules (or action rules)

Set field values in a target field in D365 F&SCM.

Enrichment rules (or action rules) are applied automatically when you manually enter or update a record.

Note:

The data quality policy rules are applied on entering or updating a record only:

  • If the 'Form' execution trigger is enabled for the data quality policy.
  • For active data quality policy versions.
  • To the applicable data.


Data quality administrator

The data quality administrator (DQSDataQualityAdministrator) can set up and maintain:

  • Data quality policies
  • Data quality studio parameters
  • Data quality studio general setup

Monitor validation history

Procedure

  1. Go to Data quality studio > Inquiries > Policy execution logs.
  2. In the list, find and select the desired record.
  3. Sub-task: Change review status to 'In progress'.
  4. Click Change status to open the drop dialog.
  5. In the Review status field, select 'In progress'.
  6. Click OK.
  7. Click Yes.
  8. Sub-task: Review and edit source record.
  9. In the list, find and select the desired record.
  10. Click Open source record.
  11. Click Edit.
  12. Close the page.
  13. Sub-task: Change review status to 'Accept deviation' or 'Completed'.
  14. In the list, find and select the desired record.
  15. Click Change status to open the drop dialog.
  16. In the Review status field, select 'Accept deviation' or 'Completed'.
  17. In the Reason code field, enter or select the desired reason code.
  18. In the Reason notes field, type a value.
  19. Click OK.
  20. Click Yes.
  21. Close the page.

Review and merge duplicates

Procedure

  1. Go to a page to which a duplicate check rule of type 'Basic matching' applies.
  2. Enter or edit a record.
  3. Click Save.
  4. Review the possible duplicates that are found by the basic matching duplicate check.
  5. Sub-task: Merge field values of duplicate records.
  6. In the Duplicates found section, click Proceed to merge.
  7. For the desired record, select the Master record check box.
  8. Select the desired 'merge' check boxes.
  9. Click Merge.
  10. Click Yes.
  11. Close the page.

Notes

If the duplicate check rule Outcome field is set to Warning and a duplicate record is found, the changed or new record is saved.

If you have merged field values to a master record, the found duplicate records are not deleted automatically, because they can be referenced in other records. If desired, delete the undesired records manually from the record page.

Check for, review, and merge duplicates

Procedure

  1. Go to the form from where you want to check if duplicates exist for a record.
  2. In the list, find and select the desired record.
  3. Start the fuzzy duplicate check: click Check for duplicates. Only one duplicate check is done. This is the first found active Fuzzy matching duplicate check that applies to the main table of the form and is used in a duplicate check rule of an active data quality policy. Note: Depending on the setup, the Check for duplicates button can be shown on the Action Pane: on the 'Data quality' tab, in the 'Duplicate check' button group; as a separate button; or on an existing action pane tab, in the 'Duplicate check' button group.
  4. Review the possible duplicates that are found by the fuzzy duplicate check.
  5. Sub-task: Merge field values of duplicate records.
  6. In the Duplicates found section, click Proceed to merge.
  7. For the desired record, select the Master record check box.
  8. Select the desired 'merge' check boxes.
  9. Click Merge.
  10. Click Yes.
  11. Sub-task: Delete undesired duplicate records.
  12. In the Duplicates found section, in the list, find and select a duplicate record that you want to delete.
  13. Click to follow the link in the Identifier field.
  14. Click Delete. Note: Before you delete a record, make sure it is not referenced in another record.
  15. Click Yes.
  16. Close the page.
  17. Close the page.
  18. Close the page.

Notes

If you run a quality assessment, duplicate checks of type 'Fuzzy matching' are done as well. If a duplicate record is found, a warning is shown for the record in the Quality assessment results. The message shows the number of duplicate records found.

Activities

Monitor validation history
Responsible: Data quality administrator

In the Data quality studio parameters, you can enable logging of data quality policy validation rule execution.

Validation rule execution results are:

  • Logged only when you manually add or change records. Each time a validation rule is executed, an entry is logged.
  • Not logged if the data quality policy is run by a:
    • Data quality assessment.
    • Connectivity Studio data import.

You can review and process the logged validation rule execution results.

For each logged validation rule execution result, you can:

  • Set the review status.
  • Open the source record to make changes, if desired.

The validation status of a logged validation rule execution result can be:

  • Success: The validation rule is met.
  • Failed: The validation rule is not met and results in an error or warning.
  • Skipped: The validation rule is not executed because its conditions are not met.
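The three validation statuses above follow a simple decision order, which can be sketched as follows. This is a minimal illustration only; the function, field names, and rule are hypothetical and not the product's API.

```python
# Illustrative sketch of the three validation statuses: Skipped when the
# rule's conditions are not met, Success when the check passes, Failed
# otherwise. All names here are hypothetical, not product code.

def run_validation_rule(record, condition, check):
    """Return the validation status for one rule applied to one record."""
    if not condition(record):
        return "Skipped"   # the rule's conditions are not met; rule not executed
    return "Success" if check(record) else "Failed"

# Hypothetical rule: for customers in group 'B2B', the primary email must be filled.
condition = lambda r: r.get("CustGroup") == "B2B"
check = lambda r: bool(r.get("PrimaryContactEmail"))

print(run_validation_rule({"CustGroup": "B2B", "PrimaryContactEmail": "a@b.com"},
                          condition, check))                       # Success
print(run_validation_rule({"CustGroup": "B2B"}, condition, check))  # Failed
print(run_validation_rule({"CustGroup": "B2C"}, condition, check))  # Skipped
```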

Review and merge duplicates
Responsible: Data quality administrator

A duplicate check rule of type 'Basic matching' checks if the data doesn't already exist in the table, based on a combination of table fields.

Duplicate check rules of type 'Basic matching' are applied automatically when you manually enter or update a record.

If merging duplicates is enabled for basic matching, and duplicates are found, you can compare the duplicates and merge field values. To enable merging duplicates for basic matching, in the Data quality studio parameters, on the General tab, set the Enable duplicate check using basic matching field to Yes.

This topic explains how to handle possible duplicates that are found if merging duplicates is enabled for basic matching.
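Conceptually, basic matching treats the combination of configured field values as a key and flags any existing record with the same key. The following sketch illustrates that idea; the table, field names, and function are hypothetical, not the product's implementation.

```python
# Minimal sketch of a basic-matching duplicate check: a candidate record is a
# possible duplicate when the combination of matching-field values already
# exists in the table. All names are illustrative only.

def find_duplicates(records, match_fields, candidate):
    """Return existing records whose match_fields values equal the candidate's."""
    key = tuple(candidate[f] for f in match_fields)
    return [r for r in records
            if tuple(r[f] for f in match_fields) == key]

customers = [
    {"AccountNum": "C-001", "OrganizationName": "Contoso", "AddressZipCode": "1234"},
    {"AccountNum": "C-002", "OrganizationName": "Fabrikam", "AddressZipCode": "5678"},
]
new_record = {"AccountNum": "C-003", "OrganizationName": "Contoso", "AddressZipCode": "1234"}

dupes = find_duplicates(customers, ["OrganizationName", "AddressZipCode"], new_record)
print([d["AccountNum"] for d in dupes])  # ['C-001']
```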

Check for, review, and merge duplicates
Responsible: Data quality administrator

A Fuzzy matching duplicate check checks for duplicates applying fuzzy logic. A Fuzzy matching duplicate check compares, for a selected record, several field values with the values of the same fields of other records.

When duplicate values are found in another record:

  1. The duplicate score is calculated. The duplicate score is calculated based on the field weightage as defined for the duplicate check.
  2. The calculated duplicate score is compared with the threshold as defined for the duplicate check.
  3. If the duplicate score is equal to or higher than the threshold, the record is reported as a possible duplicate.

Example:

Duplicate check on CustTable

Threshold: 50%

Table name            Datasource name        Field                Field label        Weightage
CustTable             CustTable              AccountNum           Customer account
CustCustomerV3Entity  CustCustomerV3Entity   AddressStreet        Street             1
CustCustomerV3Entity  CustCustomerV3Entity   AddressZipCode       ZIP/postal code    1
CustCustomerV3Entity  CustCustomerV3Entity   OrganizationName     Organization name  6
CustCustomerV3Entity  CustCustomerV3Entity   PrimaryContactEmail  Primary email      3
CustCustomerV3Entity  CustCustomerV3Entity   PrimaryContactPhone  Primary phone      3

Calculation examples:

  • Duplicate values exist in the Primary email field and the Primary phone field. The duplicate score is: (3 + 3) / 14 * 100 = 42.86. The record is not reported as a possible duplicate.
  • Duplicate values exist in the Organization name field and the ZIP/postal code field. The duplicate score is: (6 + 1) / 14 * 100 = 50. The record is reported as a possible duplicate.
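The scoring arithmetic above (matched weightage divided by total weightage, times 100, compared against the threshold) can be sketched as follows. The weightages mirror the CustTable example; the function name is illustrative, not product code.

```python
# Illustrative sketch of the fuzzy-matching duplicate score:
# score = (sum of weightages of matching fields) / (total weightage) * 100.
# Weightages are taken from the CustTable example above.

def duplicate_score(weightages, matched_fields):
    """Return the duplicate score as a percentage."""
    total = sum(weightages.values())
    matched = sum(weightages[f] for f in matched_fields)
    return matched / total * 100

weightages = {
    "AddressStreet": 1,
    "AddressZipCode": 1,
    "OrganizationName": 6,
    "PrimaryContactEmail": 3,
    "PrimaryContactPhone": 3,
}
threshold = 50.0

# Duplicates in Primary email and Primary phone: 6 / 14 * 100
score1 = duplicate_score(weightages, ["PrimaryContactEmail", "PrimaryContactPhone"])
# Duplicates in Organization name and ZIP/postal code: 7 / 14 * 100
score2 = duplicate_score(weightages, ["OrganizationName", "AddressZipCode"])

print(round(score1, 2), score1 >= threshold)  # 42.86 False -> not reported
print(round(score2, 2), score2 >= threshold)  # 50.0 True  -> reported
```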

Review and merge duplicates

You can review the found duplicates.

To solve duplicates, you can:

  • Merge field values from the duplicate records to a chosen master record.
  • Manually delete undesired duplicate records.
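Merging to a master record, as described above, amounts to copying the selected field values from a duplicate onto the chosen master while the duplicate itself is left in place. A minimal sketch, with hypothetical record and field names:

```python
# Sketch of merging field values from a duplicate record into a chosen master
# record: selected fields are copied onto the master. Duplicates are not
# deleted automatically (they may be referenced elsewhere). Illustrative only.

def merge_into_master(master, duplicate, fields_to_merge):
    """Return a copy of the master with the selected fields taken from the duplicate."""
    merged = dict(master)
    for field in fields_to_merge:
        merged[field] = duplicate[field]
    return merged

master = {"AccountNum": "C-001", "OrganizationName": "Contoso", "PrimaryContactEmail": ""}
dupe = {"AccountNum": "C-009", "OrganizationName": "Contoso Ltd", "PrimaryContactEmail": "info@contoso.com"}

merged = merge_into_master(master, dupe, ["PrimaryContactEmail"])
print(merged["PrimaryContactEmail"])  # info@contoso.com
print(merged["AccountNum"])           # C-001 (master identity is kept)
```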
