When a record is inserted or updated in a D365 F&SCM table for which data entry workflow SQL triggering is set up, the table event (Insert or Update) is logged in the Data synchronization log.
To start a data entry workflow, you must process the logged table events from the data synchronization log. Usually, you process this in a recurring batch. The processing is done based on the Workflow template SQL trigger assignment.
As a result, the first data entry workflow task is created and assigned based on the data entry workflow template configuration.
This topic explains how to process the data synchronization log.
Note:
For each SQL trigger that is defined and created in the database, all related table events are logged in the Data synchronization log.
Only the table event itself is logged and used for further processing. For an update, an event of type Update is logged, but the changed data is not recorded. For example, if you change a customer address, an event is logged stating that the customer record was updated; the address change as such is not logged. When a data entry workflow is started for the logged event, the customer data that is current at that moment is used.
If a logged event is processed, it is deleted from the data synchronization log.
1. Go to Data entry workflow > Inquiries > Data synchronization log.
2. Click Process.
3. To improve performance when processing many logged events, you can use paging. With paging, the logged events are split over several pages that are run in parallel batch tasks. If you can run only a limited number of batch jobs in parallel, or want to use a limited number of batch jobs, define the maximum allowed number of pages. In the Number of pages field, enter a number.
Note: If the number of records to be processed is less than the defined 'Number of pages' x 'Page size', fewer than the defined number of pages are used.
4. You can split the records to be processed over several pages (batch tasks). Define the number of events to be processed by one batch task (page). In the Page size field, enter a number.
Note: If the number of records to be processed exceeds 'Number of pages' x 'Page size', the Limit records setting (see the next step) determines whether the excess records are processed in the same run.
5. You can limit the number of records that are processed at once. This limit only applies when the number of records to be processed exceeds the defined 'Number of pages' x 'Page size'. If set to Yes, at most 'Number of pages' x 'Page size' records are processed, so the number of records per page does not exceed the page size. Select Yes in the Limit records field.
6. Usually, all records in the data synchronization log are processed. However, you can also process only a number of the oldest records. For example, if you enter '100', only the 100 oldest records are processed. This can be helpful to investigate issues that occur when processing older records. In the Oldest set of field, enter a number.
Note: If you enter a number in the 'Oldest set of' field, the paging fields (Number of pages, Page size, and Limit records) are ignored.
7. Sub-task: Set up batch processing.
  7.1. Usually, you process the data synchronization log in batch in a recurring pattern. Expand the Run in the background section.
  7.2. Select Yes in the Batch processing field and fill in the other fields as desired.
  7.3. Set up the recurrence for the data synchronization log processing. Click Recurrence and fill in the fields as desired.
  7.4. Click OK.
8. Click OK.
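The interaction of the paging settings described in the steps above can be sketched as follows. This is an illustrative model only, not actual D365 F&SCM code; in particular, the behavior when Limit records is No (the page size growing so that all records still fit in the allowed number of pages) is an assumption, since the product documentation above only describes the Yes case.

```python
import math

def plan_processing(log_records, number_of_pages, page_size,
                    limit_records=False, oldest_set_of=None):
    """Hypothetical sketch of how one processing run splits log records
    into pages (parallel batch tasks). log_records is oldest-first."""
    # 'Oldest set of' takes precedence: only the N oldest log records
    # are processed, and the paging fields are ignored.
    if oldest_set_of:
        return [log_records[:oldest_set_of]]

    capacity = number_of_pages * page_size
    if len(log_records) > capacity:
        if limit_records:
            # Limit records = Yes: at most 'Number of pages' x 'Page size'
            # records are processed now; the rest wait for the next run.
            log_records = log_records[:capacity]
        else:
            # Assumption: with Limit records = No, the page size grows so
            # the page count stays within 'Number of pages' and all
            # records are still processed in this run.
            page_size = math.ceil(len(log_records) / number_of_pages)

    # Split over pages of at most 'page_size' records each; with fewer
    # records than the capacity, fewer than 'Number of pages' pages result.
    return [log_records[i:i + page_size]
            for i in range(0, len(log_records), page_size)]
```

For example, 10 logged events with 2 pages of size 3 and Limit records = Yes yields two pages of three events each; the remaining four events stay in the log for the next run.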
You can also process specific data synchronization log records. To do so:
1. Go to Data entry workflow > Inquiries > Data synchronization log.
2. Select the records to be processed.
3. Click Process selection.
Related to | Notes
---|---
Set up data entry workflow SQL triggers |
Manage workflows |