Use messages as the carriers that transport data from a source to a target, based on the mapping as defined on the message.

On a message you define:
  • The source information: the connector and document that is used for the data source.
  • The target information: the connector and document that is used for the data target.
  • The record mapping and field mapping that define which data goes where in which format.
This flow explains how to set up the message header.


Set up message

Use messages as the carriers that transport data from a source to a target, based on the mapping as defined on the message.
This topic explains how to set up a message header. On a message header you define:
  • The source information: the connector and document that is used for the data source.
  • The target information: the connector and document that is used for the data target.
  • General settings that define the behavior of the message.

Procedure

1. Click Connectivity studio Integration Design.
2. Click New.
3. In the Message field, type a value.
4. In the Project field, enter or select a value.
5. Define the connector that connects to the data source. In the Source connector field, enter or select a value.
   Note: You can only select a connector that is defined for the applicable project.
6. Define the document that defines which data you want to get from the data source and in which format and structure. In the Source document field, enter or select a value.
   Note: You can only select a document that is defined for the applicable project and that is linked to the same application as the source connector. Make sure the source connector and source document have a compatible handler version. For example, if you use a source connector of type Web service, Blob storage, or Upload and download, the handler version of the related source document must be 'V3'.
7. Define the connector that connects to the data target. In the Target connector field, enter or select a value.
   Note: You can only select a connector that is defined for the applicable project.
8. Define the document that defines which data you want to move to the data target and in which format and structure. In the Target document field, enter or select a value.
   Note: You can only select a document that is defined for the applicable project and that is linked to the same application as the target connector. Make sure the target connector and target document have a compatible handler version. For example, if you use a target connector of type Web service, Blob storage, or Upload and download, the handler version of the related target document must be 'V3'.
9. Sub-task: Define history settings.
10. Expand the History section.
11. In the History type field, select an option.
12. In the Transaction level field, select an option.
13. Select Yes in the Create history report field.
14. By default, only the records with errors are stored. You can also store the successfully processed records in the history. Select Yes in the Store history field.
    Note: Usually, you only store history when you test a message.
15. On import, you can log which D365 FO data is changed during import. Changes are logged by field. Only the latest change is stored for each field. Select Yes in the Log changes field.
    Note: Usually, you only log changes when you test a message.
16. Sub-task: Define company-change settings.
17. Expand the Change company section.
18. In the Source field field, enter or select a value.
19. In the Transformation field, enter or select a value.
20. Sub-task: Define source-status update settings.
21. Expand the Update source status section.
22. In the Status field field, enter or select a value.
23. In the Processed status field, type a value.
24. In the Error status field, type a value.
25. Sub-task: Define performance-related settings.
26. Expand the Performance section.
27. You can define how batch tasks are run when a message is processed. If the type is:
      • Sequential: Batch tasks are processed in the defined sequence (files) or in the creation sequence (paging).
      • Parallel: Batch tasks are split over several threads to run in parallel. Parallel processing improves the performance of message processing.
    In the Type field, select an option.
    Note: Parallel processing is only applied if the message is run in batch.
28. If the source document is an external file-based document, and the performance type is Sequential, define the sequence in which the files are processed. In the File selection sort order field, select an option.
    Note: This field is only applicable if the source document is an external file-based document.
29. If you run an import message in batch, you can rerun the message automatically if an error occurred during a message run. You can apply one of these rerun options:
      • No: The message is not rerun automatically.
      • Rerun: The message is rerun for the full set of provided data. For example, a data file is processed again.
      • Rerun from history: The records with errors, as stored in the message history, are rerun.
    In the Automatic retry field, select an option.
    Note: You can only rerun import messages that import data from a:
      • File
      • Service Bus queue
      • Database
      • Staging journal
    The rerun is only done if one of these errors occurred:
      • TransientSqlError: The database connection is broken.
      • Deadlock: The import process is locked due to other processes.
      • Update conflict: The data to be updated is already changed.
30. Define the maximum number of attempts to rerun the message automatically. In the Max. automatic retry attempts for import field, enter a number.
    Note: The number of rerun attempts has an impact on the performance. You are advised to limit the number of rerun attempts. The default number of attempts is '5'. However, in most cases, a maximum of '3' rerun attempts is sufficient.
31. Sub-task: Define custom settings.
32. Expand the Custom section.
33. In the Message type field, select an option.
34. For a message, several standard handler classes are available. You can use a customized handler class. To do so, extend a standard handler class. In the Handler field, enter or select a value.
    Note: For a list and description of the standard message handler classes, refer to the Notes section of this topic.
35. You can use a custom action menu item to manually start a message. You can run a message to either export or import data. To make this work:
      • Create the custom menu item. The action menu item must be of type class and must be linked to the 'BisActionRunMessage' class. To avoid best practice warnings, in the menu item setup, define the permissions for the roles who can use the action menu item. As the action menu item is not linked to a RunBase class, you cannot run it in batch.
      • Add the action menu item to the form from where you want to start the message. For example, the Sales orders form to export a sales order.
      • Enter the name of the action menu item on the desired message in the Action menu item name field. For example, a message to export a sales order.
    As a result, for a selected record, you can manually run the message from the form to which you added the action menu item.
    In the Action menu item name field, type a value.
    Note: Example: on the Sales orders page, you can select a record and click the action menu item button. The message is processed, and the selected sales order is exported.
36. Sub-task: Define message owner.
37. Expand the Owner section.
38. In the Responsible field, enter or select a value.
39. Sub-task: Define EDI document flow.
40. Expand the EDI section.
41. For EDI, a document flow type must be assigned to each message. In EDI, you run the Process inbound EDI documents batch job for a specific document flow type, and optionally for a specific message. Each time the batch job is run, all document flows of the defined document flow type are processed. So, all messages, as defined for these document flows, are run. Each message that is run picks up the relevant received EDI message files (if any) from the location as defined for the message connector and processes the data. For more information on document flows, refer to the EDI documentation. In the Document flow field, select an option.
    Note: This field only applies to messages that are used for EDI.
42. Sub-task: Define splitting settings.
43. Expand the Split target per root record section.
44. For data export, it can be that a limited number of root records per file is allowed. You can define the maximum allowed number of records per file. If more records are included in one message run, these records are split over several files, based on the split quantity. For example, if the split quantity is 1000 records and 4500 records are processed, the records are split over five files. Another example: you can use this to post sales orders one by one to a web service. For data import, it can be required to limit the number of root records in one journal. For example, it can be required that a journal has a maximum of 1000 lines. If the import has 6000 records, these are split over six journals. In the Split quantity field, enter a number.
    Note: When you have defined a split quantity and the message is run, it:
      • Opens the target document to process the records till the split quantity is reached.
      • Closes the target document.
      • Runs the next message (if defined) till it is finished.
      • Opens the target document to process the next records till the split quantity is reached.
      • And so on.
45. You can define the next message to be run when a run of the current message is finished. When the current message run is finished, the defined next message is automatically started. For example, the current message imports to the staging journal. You can start the message that exports from the staging journal and imports to D365 FO when the current message is finished. This processes all approved and to-be-processed staging journals. In the Next message field, enter or select a value.
    Note: If a split quantity is defined, the next message is run before the next split quantity is processed.

Notes

For a message, these standard handler classes are available:
  • BisMessageRunDirect: This handler class runs the message by reading and writing all the applicable data, taking into account the message, connector, and document settings. The other standard handler classes are an extension of this handler class.
  • BisMdmMessageRunDirectV2: Use this handler class for Master data management messages. This is an improved version of the BisMdmMessageRunDirect handler class.
  • BisMdmMessageRunDirect: Use this handler class for Master data management messages.
  • BisMessageRunDirectSQL: Use this handler class if the message type is SQL. It gets the data from the D365 FO database and directly enters it into the external database. If the record exists in the external database, it is updated. If the record does not exist in the external database, it is inserted.
  • BisMessageRunDirectInsert: You can use this handler class if the message type is SQL. It gets the data from the D365 FO database and directly enters it into the external database. It only inserts the records in the external database.
  • BisMessageRunDirectEdi: Use this handler class for EDI messages to switch company based on the EDI history instead of the change company settings on the message header.
  • BisMessageRunDirect_deleteSkip: Use this handler class to skip the error if a to-be-deleted record does not exist. The message continues with the next record without an error message.
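As an illustration of the customized handler class mentioned in step 34, the following is a minimal X++ sketch that extends the standard BisMessageRunDirect handler class. The class name MyMessageRunDirectCustom and the run method used as the override point are assumptions for this example only; check the standard handler classes in your environment for the actual methods that can be extended.

    // Illustrative sketch only: a custom message handler that extends the
    // standard BisMessageRunDirect handler class.
    class MyMessageRunDirectCustom extends BisMessageRunDirect
    {
        // Hypothetical override point; the real method(s) to extend depend on
        // the BisMessageRunDirect implementation in your environment.
        public void run()
        {
            // Custom logic before the standard message run (assumption).

            super();    // Run the standard handler: read and write all applicable data.

            // Custom logic after the standard message run (assumption).
        }
    }

Once such a class is built, you can select it in the Handler field in the Custom section of the message header.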

Use form mapping

For a message, you can generate the record mapping and field mapping based on an external document and a recording of the applicable forms and fields in D365 FO.

The form mapping recording also records the data structure of the mapped D365 FO fields.
You can use this, for example, as a starting point for the message record mapping and field mapping.
 
@videoref:CS6


Set up data synchronization for message

On export of data, the data synchronization setup defines which records are processed.
Data synchronization only applies to messages with an internal document as source document.
You can use these types of data synchronization:

  • All: This is the standard data synchronization. All records that are found, based on the source document setup, are exported. No data synchronization setup is required for this.
  • Table events: You can use table events to log data changes. You can define, for each record, which table events are logged. The table events are logged in the Data synchronization log. From here, the logged entries must be processed further to the Outbound message queue. Process the outbound message queue to run the messages to export the records.
  • Date range: You can use a date range to export only the records that are changed or added since the latest message run.

@videoref:CS5

Set Data quality studio integration options

You can apply data quality policy rules on import of data into D365 FO with a Connectivity studio message. On the message header, define which types of rules are applied on data import with the message.
For more information, refer to Apply data quality rules on data import with Connectivity studio.

Procedure

1. Click Connectivity studio Integration Design.
2. On the Messages tab, in the list, click the link of the desired message.
3. Click the Header tab.
4. Expand the Data quality studio section.
5. Select Yes in the Action rules field.
6. Select Yes in the Validation rules field.
7. Select Yes in the Check duplicates field.
8. Close the page.

Notes

The Data quality studio integration rule options on the message header are only available if:
  • Data quality studio is installed on the same D365 FO environment as Connectivity studio.
  • On the Connectivity studio parameters, on the Data quality studio tab, the Enable data quality integration field is set to Yes.

Activities

Name Responsible Description

Set up message

Application Consultant

Use messages as the carriers that transport data from a source to a target, based on the mapping as defined on the message.
This topic explains how to set up a message header. On a message header you define:
  • The source information: the connector and document that is used for the data source.
  • The target information: the connector and document that is used for the data target.
  • General settings that define the behavior of the message.

Use form mapping

Application Consultant

For a message, you can generate the record mapping and field mapping based on an external document and a recording of the applicable forms and fields in D365 FO.

The form mapping recording also records the data structure of the mapped D365 FO fields.
You can use this, for example, as a starting point for the message record mapping and field mapping.
 

Set up data synchronization for message

Application Consultant

On export of data, the data synchronization setup defines which records are processed.
Data synchronization only applies to messages with an internal document as source document.
You can use these types of data synchronization:

  • All: This is the standard data synchronization. All records that are found, based on the source document setup, are exported. No data synchronization setup is required for this.
  • Table events: You can use table events to log data changes. You can define, for each record, which table events are logged. The table events are logged in the Data synchronization log. From here, the logged entries must be processed further to the Outbound message queue. Process the outbound message queue to run the messages to export the records.
  • Date range: You can use a date range to export only the records that are changed or added since the latest message run.

Set Data quality studio integration options

Application Consultant

You can apply data quality policy rules on import of data into D365 FO with a Connectivity studio message. On the message header, define which types of rules are applied on data import with the message.

For more information, refer to Apply data quality rules on data import with Connectivity studio.

