Activities

Activity Area Description

(Re)build 'Connectivity' business events

Design You can rebuild the business events of category 'Connectivity'. You can do so, for example, if:
  • Changes are made to the message business events. For example, message business events are deleted, or the 'Send business event' field is changed to 'Never'.
  • The connectivity setup, including the message business events, is moved from a Test environment to a Production environment.
As a result:
  • All existing business events of category 'Connectivity' are deleted from the business events catalog.
  • Based on the message business events, business events are created. A business event is created for each message business event, whether or not it previously had a related business event.
Note:
  • If a message business event is deleted while a related business event exists, on rebuild, the related business event is deleted and not rebuilt.
  • A business event is not (re)built if for the message business event the 'Send business event' field is set to 'Never'.
  • If a 'Connectivity' business event is activated before rebuild, the activation stays after rebuild is done. However, if for a message business event the 'Business event ID' is changed, the activation is undone.
  • If Connectivity setup is moved from Test to Production, the endpoints must be set up, activated, and configured again.

Activate business event

Design
Depending on the endpoint type, you activate a business event in one of these ways:
  • In the Business events catalog in D365 FO. This, for example, applies to endpoints of type 'HTTP' or 'Azure Event Grid'.
  • From the endpoint. This, for example, applies to endpoints of type 'Microsoft Power Automate' or 'Dataverse'.
For more information, refer to:

Add batch class to task

Design

For each task, you can define several (custom) batch classes. If you run a task, all defined batch classes are run.

Add child record to outbound queue

Operation When the data synchronization log is processed, the Redirect event setting of the message or web service action is considered. Each table event that occurs for a record is logged in the data synchronization log for that record and related table.
On processing the data synchronization log, by default, log entries for a child record of the source document are added to the outbound queue for the root record of the source document. However, if Redirect event is set to 'No', only the logged event on the child record is added to the outbound queue. When the outbound queue is processed, the child record is exported. Based on the applicable document setup, the applicable parent records and the root record are also exported.
Example:
The source document has a root record 'Sales order' and a child record 'Sales line'. An event is logged for a sales line. If Redirect event is set to 'No', the sales line record is added to the outbound queue. When the outbound queue is processed, the sales line and the sales order are exported.
Note:
Before a record is added to the outbound queue, a check is done whether a record already exists for the unique combination of:
- Record
- Message or web service action
- Event type
- State
If a record already exists in the outbound queue for this unique combination, the record is not added to the outbound queue.
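The deduplication check above can be sketched as follows. This is an illustrative sketch, not product code; all names are hypothetical.

```python
def add_to_outbound_queue(queue, record_id, action, event_type, state="Pending"):
    """Add an entry only if none exists for the unique combination of
    record, message/web service action, event type, and state."""
    key = (record_id, action, event_type, state)
    exists = any(
        (e["record_id"], e["action"], e["event_type"], e["state"]) == key
        for e in queue
    )
    if exists:
        return False  # duplicate combination: entry is not added
    queue.append({"record_id": record_id, "action": action,
                  "event_type": event_type, "state": state})
    return True
```

For example, logging the same update event twice for the same sales line and message adds only one outbound queue entry.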

Add document record fields - JSON

Design

To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.

For JSON documents, fields are always of type string, to enable type conversions.

This topic explains how to add record fields to a JSON document.

If fields are already initialized for the document, selected for the record, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - EDI

Design To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.
For EDI documents, fields are always of type string, to enable type conversions.
This topic explains how to add record fields to an EDI document.
If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - Internal documents

Design To each document record, add the data fields whose values must be exchanged. For internal documents, set up the fields in line with naming in D365 FO.
For internal documents, make sure the fields have the same type as in D365 FO.
This topic explains how to add records to documents of these types: D365 FO, Journal, or Staging.
If fields are already selected for or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - Microsoft Excel

Design To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.
For Microsoft Excel documents, fields can be of any type.
This topic explains how to add record fields to a Microsoft Excel document.
If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - Microsoft Word

Design To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.
For Microsoft Word documents, fields are always of type string, to enable type conversions.
This topic explains how to add record fields to a Microsoft Word document.
If fields are already selected for or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - Text

Design To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.
For Text documents, fields are always of type string, to enable type conversions.
This topic explains how to add record fields to a Text document.
If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - XML

Design

To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.

For XML documents, fields are always of type string, to enable type conversions.

This topic explains how to add record fields to an XML document.

If fields are already initialized for the document, selected for the record, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document records - EDI

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to an EDI document.

Add document records - Fixed text

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to a Fixed text document.

Add document records - Internal document

Design

To each document, add the data records to be exchanged. For internal documents, set up the records in line with how the data is structured and named in D365 FO.

This topic explains how to add records to documents of these types: D365 FO, Journal, or Staging.

Add document records - JSON

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to a JSON document.

If records are already initialized for the document, you can review and complete the setup for these records. To do so, instead of adding a record, select the desired record.

Add document records - Microsoft Excel

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to a Microsoft Excel document.

Add document records - Microsoft Word

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to a Microsoft Word document.

Add document records - ODBC

Design

To each document, add the data records to be exchanged. For ODBC documents, set up the records in line with how the data is structured and named in the external database.

This topic explains how to add records to an ODBC document.

Add document records - Text

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to a Text document.

Add document records - XML

Design

To each document, add the data records to be exchanged. For external file-based documents, set up the records in line with how the data is structured and named in the file.

This topic explains how to add records to an XML document.

If records are already initialized for the document, you can review and complete the setup for these records. To do so, instead of adding a record, select the desired record.

Add document record fields - Fixed text

Design To each document record, add the data fields whose values must be exchanged. For external file-based documents, set up the fields in line with naming in the file.
For Fixed text documents, fields are always of type string, to enable type conversions.
This topic explains how to add record fields to a Fixed text document.
If fields are already selected for or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add document record fields - ODBC

Design To each document record, add the data fields whose values must be exchanged. For ODBC documents, set up the fields in line with naming in the external database.
For ODBC documents, make sure the fields have the same type as in the external database.
This topic explains how to add record fields to an ODBC document.
If fields are already initialized for, selected for, or copied to the record, you can review and complete the setup for these fields. To do so, skip step 6.

Add file to Working folder

Design
If you want to test an import message with an external file-based source document, you need a file with test data to be imported. This file must be available in the Working folder that is defined for the source connector.
You can add the test data file to the Working folder in these ways:
  • Upload the file to the Working folder.
  • Copy or move a file from the Archive or Error folder to the Working folder.

Add mapping fields

Design

Set up the field mapping for each record mapping of the message. On the field mapping, you define which target document record fields are mapped to which source document record fields. The resulting field mapping is used to get the right data from the source and get it in the right format to the right place in the target.

To set up the field mapping, you can:

  • Add fields manually.
  • Select fields from a list.

You can only use the fields as defined for the related record in the target document.
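The rule above can be sketched as a validation step: a mapping entry is only accepted when its target field is defined for the related target record. This is an illustrative sketch with hypothetical names, not product code.

```python
def add_mapping_field(mapping, target_record_fields, target_field, source_field):
    """Map a source field to a target field, allowing only fields that are
    defined for the related record in the target document."""
    if target_field not in target_record_fields:
        raise ValueError(
            f"Field '{target_field}' is not defined on the target record")
    mapping[target_field] = source_field
    return mapping
```

For example, mapping an external column 'ITEMNUMBER' to the target field 'ItemId' succeeds only if 'ItemId' is set up on the target record.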

Add master data entity to task

Design

For each task, you can define several master data entities. If you run a task, all defined master data entities are run.

For more information on master data entities, refer to Master data management studio.

Add message to task

Design For each task, you can define several messages. If you run a task, all defined messages are run.

Add outbound web service action to task

Design

For each task, you can define several outbound web service actions. If you run a task, all defined outbound web service actions are run.

Add project to version management

Analysis

If version management is activated in the Connectivity studio parameters, you can add projects to version management. So, version management is only applicable to the projects that you added to version management.

If you add a project to version management, it is automatically checked out to make changes. To add the current project setup to version management as the first project version, check in the project. As a result, a version file is created and added to the folder as defined in the Business integration parameters.

Add root record to outbound queue

Operation
If an event is logged for the root record of the applicable source document, the Redirect event setting is not applicable. On processing the data synchronization log, the logged event on the root record of the source document is added to the outbound queue. When the outbound queue is processed, the root record is exported. If, in the applicable document, child records are defined for the root record, these child records are exported as well.
Note:
Before a record is added to the outbound queue, a check is done whether a record already exists for the unique combination of:
- Record
- Message or web service action
- Event type
- Status
If a record already exists in the outbound queue for this unique combination, the record is not added to the outbound queue.

Add root record to outbound queue (Redirect)

Operation When the data synchronization log is processed, the Redirect event setting of the message or web service action is considered. Each applicable table event that occurs for a record is logged in the data synchronization log for that record and related table.
On processing the data synchronization log, by default, logged events for a child record of the source document are added to the outbound queue for the root record of the source document. When the outbound queue is processed, the root record and its child records are exported.
Example:
The source document has a root record 'Sales order' and a child record 'Sales line'. An event is logged for a sales line. If Redirect event is set to 'Yes', the logged event on the sales line is not added to the outbound queue itself. Instead, the event is added to the outbound queue as logged on the sales order. When the outbound queue is processed, the sales order and all its sales lines are exported by the message.
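The redirect decision in this example can be sketched as a single rule: with redirect on, a child-record event is queued against the root record of the source document; with redirect off, it is queued against the child record itself. A minimal sketch with hypothetical names:

```python
def queue_target(logged_record, root_record, redirect_event):
    """Return the record against which a logged child-record event is
    added to the outbound queue."""
    return root_record if redirect_event else logged_record
```

With Redirect event set to 'Yes', a sales line event is queued on the sales order; with 'No', it is queued on the sales line.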
Note:
Before a record is added to the outbound queue, a check is done whether a record already exists for the unique combination of:
- Record
- Message or web service action
- Event type
- State
If a record already exists in the outbound queue for this unique combination, the record is not added to the outbound queue.


Analyze the comparison results

Design

As a result of a content packages comparison, all differences between the selected content packages are shown on the Compare page.

Compare results

On the Compare page, the Compare results tree shows where differences between the compared content packages exist. The Compare results tree has these types of entries: 

  • Table: This is the highest level in the Compare results tree. It represents a table with records for which data differences exist between the compared content packages. 
    The entry name is defined by the target record name in the applicable message mapping. 
    The number of data differences for the table is shown in parentheses. 

  • Record: This is the lowest level in the Compare results tree. It represents a record for which data differences exist between the compared content packages. 
    The entry name is defined by the key as set up in the applicable message field mapping. 
    The number of data differences for the record is shown in parentheses. 

Possible differences

For the selected entry in the Compare results tree, the related differences are shown in the Results pane. In the Results pane, for each difference, one of these messages is shown to describe the difference: 

  • Field values are different 

  • Field does not exist in source 

  • Field does not exist in target 

  • Record only exists in source 

  • Record only exists in target 

Note:

The "Field does not exist in source" and "Field does not exist in target" messages do not imply that the original data tables lack the specified field. They indicate that the applicable message mapping removed this field from the resulting XML file.

Analysis

Use the Source value and Target value fields to analyze the difference between the field values. 

You can choose one of these options to indicate how to solve a difference:  

  • Synchronized manually: You want to set the field value manually. You can do so in the resulting XML file. 

  • Use source value: Use the value from the source content package as final value. 

  • Use target value: Use the value from the target content package as final value. 

You can use the View source record and View target record buttons to view the full selected record details in XML format. The shown data is retrieved from either the source content package or the target content package. 

Analyze tracer

Design When you run a message for testing purposes, you can use a tracer to register what the message does when it is run. When the message has run using the tracer, you can review the registered actions in the tracer. Usually, you use this to find issues in the message run.

Apply custom expression (Field option)

Design

You can use an expression to modify the value. In the expression, use source fields, target fields, or mapping variables as variables.
As a variable in the expression, you can use a:

  • Field from any record of the applicable document. If you use a field from another record than the current record, make sure this record is mapped before the current record.
  • Mapping variable that is defined for the message.

Examples:

You can use an expression to:

  • Combine two fields. You can, for example, make a contact location name specific by combining it with the purpose. The expression can be: Description+'-'+Purpose. This results in, for example: Shop Main Street-Financial.
  • Calculate the unit price based on the quantity and line amount on a purchase order confirmation. The variables are from the source record: Qty (PurchLine, PurchQty field) and Amount (PurchLine, LineAmount field). The expression is: Amount/Qty.

Note: 

  • Usually, you use string and real values as variables.
  • Date calculations are not supported.
For more information on expressions, refer to: Expression.
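The two examples above can be sketched by evaluating an expression over a set of named variables. Connectivity studio has its own expression engine; the sketch below only illustrates the concatenation and the unit-price calculation, using Python's restricted `eval` with hypothetical variable names.

```python
def eval_expression(expression, variables):
    """Evaluate an expression over the supplied variables only
    (illustration only; not the product's expression engine)."""
    return eval(expression, {"__builtins__": {}}, dict(variables))

# Combine two fields into a specific contact location name.
combined = eval_expression(
    "Description + '-' + Purpose",
    {"Description": "Shop Main Street", "Purpose": "Financial"})

# Calculate the unit price from the line amount and quantity.
unit_price = eval_expression("Amount / Qty", {"Amount": 250.0, "Qty": 10.0})
```

The first expression yields 'Shop Main Street-Financial'; the second yields 25.0.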

Apply custom handler (Field option)

Design

You can use a handler class to set the target field value. Several handler classes are available. For example, to get the current date, to add the time to a date, or to get the current company.

Apply dimension set (Field option)

Design

On D365 FO tables, a financial dimension is expressed as a RecId that refers to a financial dimensions table. So, it does not reflect the financial dimension name and value. You can use the Dimension set option to get or set the financial dimension value based on the RecId and the dimension name (or number) as defined for the field option. If the message is used to:

  • Import, you receive the dimension value. Based on the RecId in the target field and the name (or number), the value is set in the relevant financial dimensions table. Note: On import, the target field type must be other than String.
  • Export, based on the RecId in the source field and the name (or number), the dimension value is retrieved from the relevant financial dimensions table. Note: On export, the source field type must be int64.

Note:

Only use the Dimension set field option if one of these fields is part of the field mapping:

  • DefaultDimension: This field is used to store default dimensions for, for example, customers, vendors, or customer groups.
  • LedgerDimension or OffsetLedgerDimension: These fields are used to store financial dimensions for ledger journals of the 'Main account' account type.
  • DefaultDimension or OffsetDefaultDimension: These fields are used to store financial dimensions for ledger journals of other account types.
This field must be the:
  • Target field of the field mapping for imports.
  • Source field of the field mapping for exports.
For each financial dimension that you want to set or get, use a separate field mapping. So, in the field mapping, you can have the same financial dimension field several times.

Apply display method (Field option)

Design

You can use a display method to get the applicable value. For example, you can use a display method to calculate a value.

Apply edit method (Field option)

Design

You can use an edit method to set the target field value. On import, the edit method changes the value in the target field of the D365 FO table.

Apply external code

Design

In a field mapping, you can apply an external code as defined for the related entity.

As a result, on:

  • Import, the external code is replaced with the related code in your D365 FO environment.
  • Export, the code in your D365 FO environment is replaced with the related external code.

You can only use the external code setup in messages that are run in EDI studio.

To use the external code functionality, in EDI studio, additional external code setup is required for the EDI parties or EDI groups. For each EDI party or EDI group, you can define an external code definition as set up for a:

  • Currency
  • Released product
  • Charge codes of miscellaneous charges
  • Modes of delivery
  • Terms of delivery
  • Unit

Apply external reference (Field option)

Design

You can link an external ID to a record ID in D365 FO. Together with the external ID, you can also link an external revision number to a record ID.
The application, as defined for the connector, is used to store the link between the reference table and record ID in D365 FO and the external ID and revision number.
If you:

  • Export data, the applicable external ID or revision is searched for in the target connector application. If a record exists for the combination of reference table and source field (RecId), the external ID or revision of this record is the output value of this option.
  • Import data, the applicable target record ID is searched for in the source connector application. If a record exists for the combination of reference table, external ID, and revision, the reference record ID of this record is the output value of this option. If no record exists for the combination, a record is created. The reference record ID of this record is the output value of this option.

Note: 

  • You can only link an external ID or revision number to a record ID if, on the applicable document, the External reference field and the Revision field are filled in for the source field or target field of the current field mapping.
  • You can only link a revision to a record ID in D365 FO if you also link the external ID. The external ID field mapping must precede the revision field mapping.
  • Usually, you use the External reference option on a field mapping where the source field (export) or target field (import) is the RecID.
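The export and import behavior described above can be sketched as a two-way link store: export looks up the external ID for a RecId, and import looks up (or creates) the RecId for an external ID and revision. This is an illustrative sketch; all names and the in-memory storage are hypothetical.

```python
def export_external_id(links, ref_table, rec_id):
    """On export: return the (external ID, revision) linked to the RecId, if any."""
    return links.get((ref_table, rec_id))

def import_record_id(links, reverse, ref_table, external_id, revision, new_rec_id):
    """On import: return the linked RecId; create the link when none exists."""
    key = (ref_table, external_id, revision)
    if key not in reverse:
        reverse[key] = new_rec_id
        links[(ref_table, new_rec_id)] = (external_id, revision)
    return reverse[key]
```

Once an external ID is linked on import, later imports of the same external ID and revision resolve to the same record ID.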

Apply inventory dimension (Field option)

Design

On several D365 FO tables, inventory dimensions exist. Each inventory dimension is expressed in a RecId that refers to the Inventory dimensions (InventDim) table. So, it does not reflect the inventory dimension name and value. You can use the Inventory dimension option to get or set the inventory dimension value based on the RecId and the dimension name as defined for the field option.

  • On import, you receive the dimension value. Based on the RecId in the target field and the selected inventory dimension, the value is set in the InventDim table.
  • On export, based on the RecId in the source field and the selected inventory dimension, the dimension value is retrieved from the InventDim table. Note: On export, the source field type must be int64.

Note: 

  • Only use the Inventory dimension field option if the InventDimId field is the target field of the field mapping.
  • For each used inventory dimension, add a separate field mapping. For example, if you use five inventory dimensions on a sales line, add five field mappings for the sales line target record. Each of these field mappings has the InventDimId field as target field. For each InventDimId field mapping, select the Inventory dimension field option and select the applicable inventory dimension.

Apply ledger (Field option)

Design

To exchange journal data, you can get or set the account number for ledger transactions. Use 'Default' to handle the value as a main account, or use 'Ledger' to use another field that indicates whether the value is, for example, a project, customer, or main account.

Note: If the account type is Main account, only use the Ledger field option if one of these fields is part of the field mapping: LedgerDimension or OffsetLedgerDimension.

Apply lookup (Field option)

Design You can use a lookup to get a value from another table and use it as the output value of this option. For example, you can get a value from a table that is not in the source document.
The other table must have the source field as the single key field. The current value of the 'Modify a value' process is the input of the lookup. The value that is returned by the lookup is the value of the Return value field.
Before determining the output value of this option, you can apply a type conversion or a transformation.
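The lookup described above can be sketched as a keyed search in another table: the current value is the key, and the configured return field supplies the output. An illustrative sketch with hypothetical names:

```python
def lookup_value(table_rows, key_field, key_value, return_field, default=None):
    """Find the row where the single key field matches the current value,
    and return the value of the configured return field."""
    for row in table_rows:
        if row.get(key_field) == key_value:
            return row.get(return_field)
    return default
```

For example, looking up an item ID in a table that is not part of the source document returns that item's name.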

Apply mapping variable

Design

During a message run, you can use mapping variables to temporarily store values. You can write (calculated) values to a mapping variable and, later during the message run, read the value from the mapping variable. You can use mapping variables across records.
The mapping variable values are only stored during the message run. When the message run is finished, the mapping variable values are deleted.
During a message run, to read a mapping variable, on a field mapping, define the target field to be filled with the mapping variable value. Instead of a source field, use the 'Variable' field option to define the mapping variable whose value must be used to set the target field value.
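The lifetime rule above can be sketched as run-scoped state: variables written during a run are readable by later mappings in the same run, and are gone once the run ends. An illustrative sketch with hypothetical names:

```python
class MessageRun:
    """Run-scoped store for mapping variables (illustration only)."""

    def __init__(self):
        self.variables = {}  # lives only for the duration of this run

    def write_variable(self, name, value):
        self.variables[name] = value

    def read_variable(self, name):
        return self.variables.get(name)
```

A value written early in a run is visible to later field mappings; a new run starts with no variables.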

Apply number sequence (Field option)

Design

You can use a number sequence to get the field value. So, instead of the source field value, the next available number in the number sequence is the applicable value.
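The behavior above can be sketched as a counter that ignores the source value and emits the next formatted number. The prefix and number format below are hypothetical, not the product's number sequence setup.

```python
class NumberSequence:
    """Minimal number sequence sketch: each call returns the next number."""

    def __init__(self, prefix, start=1):
        self.prefix = prefix
        self.next_number = start

    def next(self):
        value = f"{self.prefix}{self.next_number:06d}"
        self.next_number += 1
        return value
```

Each mapped record then gets a consecutive value regardless of what the source field contains.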

Approve staging journal

Design If all errors and warnings of a staging journal are solved, accepted, or canceled, approve the staging journal. The staging journal is again validated according to the applicable journal validation setup. If the applicable validation rules are met, the staging journal status is set to Approved. Approved staging journals can be further processed by the message that imports the staging journal records into D365 FO.

Auto fix errors

Design If errors are found by the automated error check, you can first try to have these errors fixed automatically. An example of an error that can be fixed automatically: the field length of an internal document record field does not match the field length in the related table. In this case, the document record field length is changed to the table field length.
You can auto-fix errors for:
  • Projects
  • Documents
  • Document records
  • Messages
  • Message - Data synchronization setup
  • Message mapping
  • Connectors
  • Web services
  • Web service - Data synchronization setup
In this activity, as an example, the steps explain how to auto-fix a document. Where applicable, notes are added to explain how to auto-fix the other types.

Change project or related components

Analysis If a project is checked out, you can make the required changes to the project or to the related components.

Check if event complies with source document setup

Operation
For each applicable message, a check is done whether the record for which the event is logged complies with the source document of the message.
For example, checks are done against the applicable document record ranges and join modes.

Check in a project

Analysis

If version management is active and a project is checked out to make changes, check in the project to make the changes generally available for other environments.

As a result, the changes to the project details and related components are stored as a new project version. The project version is stored as a file in the file storage folder as defined in the Connectivity studio parameters.

Check out a project

Analysis If version management is active, you must check out a project to make changes to the project or its components.
You check out a project in an environment. Each user in this environment can make changes to the checked-out project. If a project is checked out, it cannot be checked out in another environment.
You can only check out the latest project version. If you check out a project and the active version is not the latest version, you can choose to first get the latest version. If you do not get the latest version, the project is not checked out.

Check relations

Design

For internal documents and ODBC documents, the relation between the tables of a parent record and a child record is important. The relation is required to be able to query from parent to child records in the record tree.

Check or define the relation between the table for the current record and the table of the parent record.

Note: If you define the parent record for an internal document and a table relation exists with the current record's table, the first-found relation is automatically added to the current record. If the relation fields are not yet added to the parent record or the current record, they are added automatically. For an ODBC document, you must set the relations manually.

Check setup

Design

If you have finished the setup, you can run a test to check for errors in the setup. You can do so for:

  • Projects: When automatically checked, only the project setup is checked, and not the related setup like messages and connectors. When started manually, the full project setup is checked.
  • Documents: The document setup is checked, including the document records setup.
  • Document records: Only the document records setup is checked.
  • Messages: The message setup is checked, including the data synchronization setup and message mapping.
  • Message - Data synchronization setup: Only the data synchronization setup is checked.
  • Message mapping: Only the message mapping is checked.
  • Message business events: A check is done whether a business event is created for the message business event. Also, a check is done whether the target fields and source fields match the related document setup.
  • Connectors: Only the connector setup is checked.
  • Web services: The web service setup is checked, including the data synchronization setup.
  • Web service - Data synchronization setup: Only the data synchronization setup is checked.

If an error is found, in the message bar, a message is shown indicating the error.

If, for an entity, an error exists or the setup is incomplete, an error icon is shown. You can click the icon to show the related error in the message bar. 

In this activity, as an example, the steps explain how to check a document. Where applicable, notes are added to explain how to check the other types.

Clean up data changes log

Design

On a message header, in the Log changes field, you can indicate whether data changes are logged. If set to 'Yes', the D365 FO data that is changed during import is logged. Changes are logged by field. Only the latest change is stored for each field.

The data changes are logged in the History change log (BisConHistoryChangeLog) table. To view the logged data changes for a message, on the Message page, on the Action Pane, on the Operations tab, click Show changes.

You can clean up the logged data changes manually or in recurring mode.

Clean up generic staging journal table

Design In Connectivity studio, you can use the staging concept to validate data in an intermediate area before it is further processed.
One predefined generic staging journal table is available for Connectivity studio: 'BisStagingBufferOrderJournal'.
This topic explains how to clean up the 'BisStagingBufferOrderJournal' table.
You can clean up the staging journals manually or in recurring mode.
For example, you want to keep staging journals for six months. Each week, you can do a cleanup, deleting staging journals older than six months.
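The retention rule described above can be sketched as follows. This is a minimal, illustrative sketch, not the product's actual cleanup job: the record shape and names are assumptions.

```python
from datetime import date, timedelta

def records_to_delete(journals, today, retention_days=182):
    """Return the staging journal IDs older than the retention period.

    `journals` is a list of (journal_id, created_date) tuples; this
    shape is illustrative, not the actual table schema. 182 days
    approximates the six-month retention from the example.
    """
    cutoff = today - timedelta(days=retention_days)
    return [jid for jid, created in journals if created < cutoff]

journals = [("J1", date(2023, 1, 10)), ("J2", date(2023, 8, 1))]
# J1 is older than six months on 2023-09-01, J2 is not
print(records_to_delete(journals, today=date(2023, 9, 1)))  # → ['J1']
```

Running this weekly (as in the example) keeps only journals created within the last six months.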

Clean up history tables

Deployment

Each integration run results in history records. History records can be logged for:

  • Projects
  • Tasks
  • Web service actions
  • Messages
  • Service Bus queues
  • Test cases
You can clean up the logged history manually or in recurring mode.
For example, you want to keep history records for six months. Each week, you can do a cleanup, deleting history records older than six months.
These tables are cleaned up:
  • BisBufferTable
  • BisConHistoryChangeLogTable
  • BisConnectorFileHistoryTable
  • BisConSBReceivedData
  • BisConSBSendData
  • BisFlowActionHistory
  • BisHistory
  • BisHistoryCalculation
  • BisHistoryEntity
  • BisMessageDebugger
  • BisProjectHistory
  • BisTaskHistory
  • BisTestCaseRunResultTable
  • BisTestCaseRunTable
  • BisWsAttributeHistory
  • BisWsHistory
  • BisWsWebRequestHistory

Clean up outbound queue

Deployment

On processing the outbound queue, for each record, the related message or web service action is run to export the applicable data. A successfully processed outbound queue record gets the status Processed. 

You can clean up the outbound queue by deleting the records with the status Processed.

Clean up unused fields

Design If the document is linked to a message, you can clean up the fields. All fields that are not used in the message mapping are shown on a separate page. You can decide which of the unused fields you want to delete from the document.
All messages to which the document is linked are checked. Only fields that are not used in any of the field mappings of the related messages are shown as unused.
You can use this, for example, if you have initialized the record or fields and all found fields are added to the document records.

Combine logged events of same record and type

Operation
In the data synchronization log, several events can be logged for a unique record and event type combination. To prevent processing duplicate events, all logged events for a unique record and event type combination are further processed as one event. Finally, this results in only unique records in the outbound queue.
Example:
A sales order is changed several times. So, several events of type Update are logged for the sales order. When the data synchronization log is processed, these events are first grouped on one page. When the page is processed, these events are combined into one event for the sales order for further processing.
Note:
If you split logged events over pages, combining events for a unique record and event type combination is done separately for each page.
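The combining behavior described above can be sketched as a simple de-duplication by record and event type. This is an illustrative sketch only; the event shape is an assumption, not the product's data model.

```python
def combine_events(logged_events):
    """Collapse logged events so each unique (record, event type)
    combination is processed only once, preserving first-seen order.

    Each event is an illustrative (record_id, event_type) tuple.
    """
    seen = set()
    combined = []
    for record_id, event_type in logged_events:
        key = (record_id, event_type)
        if key not in seen:
            seen.add(key)
            combined.append(key)
    return combined

# A sales order changed three times logs three Update events,
# but only one combined event is processed further.
log = [("SO-1", "Update"), ("SO-1", "Update"), ("SO-1", "Update"), ("SO-2", "Insert")]
print(combine_events(log))  # → [('SO-1', 'Update'), ('SO-2', 'Insert')]
```

As the note says, if events are split over pages, this combining would run per page rather than over the full log.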

Compare projects

Analysis

You can compare a project for reviewing purposes.

You can compare a:
  • Project with another project.
  • Project version with another version of the same project.
  • Project with a project that is exported to a file.
  • Project with a project that is available as a resource.
On the Compare projects page, the:
  • Target project is the current project.
  • Source project is the project with which you compare the current project.
A record that is shown in the comparison can have one of these statuses:
Status Description
Inserted by source The record exists in the source project (version), but not in the target project (version).
Inserted by target The record exists in the target project (version), but not in the source project (version).
Updated The record is updated in either the source or the target project (version).

Configure business events endpoint

Design For several endpoint types, you configure the business event endpoints:
  • In the Business events catalog in D365 FO. This, for example, applies to endpoints of type 'HTTP' or 'Azure Event Grid'.
  • From the endpoint. This, for example, applies to endpoints of type 'Microsoft Power Automate' or 'Dataverse'.
For more information, refer to Manage business event endpoints. The tutorials, as referred to in this topic, give more detailed information.

Configure the App Service web app

Design Before you can use the web site, as installed on the App Service web app, to connect to D365 FO, configure the web app.

Configure the IIS application

Design Before you can use the web site to connect to D365 FO, configure the IIS application.

Connect environment to Azure file share

Design
To exchange files with the environment to which you connect, you can give it access to the Azure file share.
To be able to access the Azure file share from another environment, you can mount the Azure file share with the other environment.
For more information, refer to Use an Azure files share with Windows.

Copy fields

Design You can copy fields from a record of another document. You can use this, for example, to save setup time if you use a specific record in several documents.
You can only copy fields:
  • From a record with the Record table defined.
  • To a record with no fields.
As a result, the full field setup is copied from the selected record to the current record.

Copy message

Design

You can create a copy of a message. For example, to troubleshoot a failing message that is already in operation, you can copy the message. So, you do not disrupt the operational data.

Create an Azure Logic App as inbound web service

Design Create an Azure Logic App that serves as inbound web service for D365 FO. For more information, refer to Azure Logic Apps documentation.
You can, for example, create a logic app that picks up files when created in OneDrive and trigger the applicable web service action in D365 FO.

Create an Azure Logic App as outbound web service

Design Create an Azure Logic App that serves as outbound web service for D365 FO. For more information, refer to Azure Logic Apps documentation.
You can, for example, create a logic app that posts files to OneDrive.

Create Azure Service Bus namespace

Design Create an Azure Service Bus namespace. For more information, refer to Create a Service Bus namespace using the Azure portal.
Copy the Primary connection string and the Primary key of the namespace to a temporary location for later use. For more information, refer to Get the connection string.

Create Azure SQL database

Design

You can use a connector of type Database to connect to an Azure SQL database.

For more information on Azure SQL databases, refer to Azure SQL Database.

Create Azure Storage Account

Design
 

You can use an Azure Storage Account to:

  • Exchange data files between your D365 FO environment and another environment, for example an on-premises environment. You use the Azure Storage Account on Azure file storage connector setup or on Blob storage connector setup.
  • Store general Connectivity studio files. For example, version management files and history report files. You use the Azure Storage Account on Connectivity studio parameter setup.
You can:
  • Create an Azure Storage Account.
  • Use the Azure Storage Account of your D365 FO environment. If a cloud-hosted D365 FO environment is created with LCS, the related Azure Storage Account is created automatically. The default name of this Azure Storage Account is the same as the D365 FO environment name. Note: When you delete the D365 FO environment, the related Azure Storage Account is also deleted.
For more information, refer to Create an Azure Storage Account.

Create Blob container

Design For the Azure Storage Account, create a Blob container where data files are uploaded to or downloaded from.
In the Blob container, create the folders that are required to exchange files. You can create folders that relate to the paths in the Properties section, the Read section, and the Write section of a connector of type 'Azure Blob storage':
  • Working
  • Archive
  • Error
  • Split/Temporary
  • Temporary
For more information, refer to Azure Blob storage.

Create content package

Design

In the Environment comparison studio, you can compare data using content packages.

To start a data comparison, you need two content packages. Generate each of these content packages based on the XML files in the applicable Azure Storage account folder from the desired company/environment combination.

The generation process:

  1. Creates a content package
  2. Gets the data from all XML files in the specified Azure storage account folder
  3. Adds this data to the created content package

You can compare the generated content package with another content package.

 

Create file share for Azure file storage connector

Design
For the Azure Storage Account, create a file share where data files are uploaded to or downloaded from.
For the file share, you can create the required folders.
You can create folders that relate to the paths in the Properties section, the Read section, and the File actions section of the connector:
  • Working
  • Archive
  • Error
  • Split
  • Copy
  • Move
For more information, refer to Create file share.

Create file share for version management and history reporting

General

For the Azure Storage Account, create a file share for the version management files and history reporting files.

For the file share, you can create the required folders.
Create folders that relate to the paths in these sections of the Connectivity studio parameters:
  • Version management section: Shared path
  • History section: History report path
For more information, refer to Create file share.

Create mapping variables

Design

During a message run, you can use mapping variables to store values. You can write (calculated) values to a mapping variable, and later during the message run, read the value from the mapping variable. You can use mapping variables across records.

The mapping variable values are only stored during the message run. When the message run is finished, the mapping variable values are deleted.

During a message run, to:

  • Write a mapping variable, on a field mapping, define the mapping variable as target variable, instead of using a target field. To set the mapping variable value, you can use several field mapping options, for example, a constant value, a custom expression, or a source field value. You can also use mapping variables on custom methods.
  • Read a mapping variable, on a field mapping, define the target field to be filled with the mapping variable value. Instead of a source field, use the Variable field options to define the mapping variable whose value must be used to set the target field value.
  • Read a mapping variable for use in a custom expression, for the custom expression, create a variable with the Mapping variable field filled.
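The lifecycle described above — write during the run, read later in the run, deleted when the run finishes — can be sketched as a simple store. This is an illustrative model only; the class and method names are hypothetical, not the product API.

```python
class MappingVariables:
    """Illustrative store for mapping variables during one message run.

    Values exist only for the duration of the run; clear() mimics the
    automatic deletion when the message run finishes.
    """
    def __init__(self):
        self._values = {}

    def write(self, name, value):
        # On a field mapping: the variable is the target instead of a field
        self._values[name] = value

    def read(self, name):
        # On a field mapping: the Variable field options supply the value
        return self._values.get(name)

    def clear(self):
        # Message run finished: all mapping variable values are deleted
        self._values.clear()

run_vars = MappingVariables()
run_vars.write("TotalAmount", 250.0)    # written by an earlier field mapping
print(run_vars.read("TotalAmount"))     # → 250.0, read by a later mapping
run_vars.clear()
print(run_vars.read("TotalAmount"))     # → None, values gone after the run
```

Because the store is shared across the run, this also models using mapping variables across records.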

Create message business event

Design If you want to create a business event to be triggered by a message, first create and set up a message business event. A message business event is a business event definition that is used to create a business event in D365 FO.
For each message, you can only create one message business event.
The business event is triggered when the message run is done, taking into account the 'Send business event' setting.

Create Microsoft Word document from project

Analysis

To analyze a project, you can create a Microsoft Word document based on the project. A summary of the project setup and related components setup is added to the document. You can use the document to review the setup.

When created, the Microsoft Word document is downloaded to your local downloads folder.

To create the document, the Microsoft Word template is used that is defined in the Connectivity studio parameters.

Create on-premises Data Source Name (DSN)

Design Create a Data Source name (DSN) on the external on-premises server where you installed the BIS Azure Service Bus Windows service.
Note:
If you connect from D365 FO on-premises to an external on-premises server, no Azure Service Bus is required. In this case, the Data Source Name (DSN) must be created on the D365 FO server.

Create project

Analysis Use a project as the basis for each integration.
For each project, you can also define the subprojects. You can use the subproject functionality to run multiple projects at once. The subprojects are run after you run the project. You can, for example, use subprojects for data migration. In this case, you set up a separate project for each applicable D365 FO module, for example, Sales and marketing, Procurement and sourcing, and Production control. Also, set up an 'umbrella' project and link the module-specific projects to it as subprojects. To run the data migration, you run the 'umbrella' project and all subprojects are automatically run as well.

Create related record

Design
You can add a record based on an existing table relation in D365 FO.
You can only create a related record for:
  • Internal documents.
  • Fields that are part of a table relation.
Example:
The record CustTable, has a field CustGroup.
This CustGroup field is part of the table relation between the CustTable and CustGroup table.
Create a related record for the CustGroup field of the CustTable record.
As a result, the CustGroup record is created and added as a child record to the CustTable record. To the CustGroup record, the mandatory fields of the CustGroup table are added.

Create secret reference

Design

You can create secret references to store secrets at a central place in Connectivity studio. Wherever you need a secret in Connectivity studio, you can use a secret reference.

Benefits of using secret references are:

  • One place to maintain secrets. For example, you can use a secret in several connectors. If the secret expires, you only update the secret reference instead of updating the secret separately for each applicable connector.
  • When you use a secret reference, the secret is not shown or visible where the secret is applied. The secret is also not visible when you export or import project configurations.

Create task

Design You can use tasks to set up the execution of:
  • An integration or data migration.
  • Outbound web services.
  • Batch classes.
  • Master data management.

Create test case for export message

Design You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases.
This topic explains how to create a test case for an export message; the message source document is an internal document.

Create test case for import message

Design You can use test cases to automate the testing of an integration or data migration setup. For each message, you can create the desired test cases.
This topic explains how to create a test case for an import message; the message target document is an internal document.

Create test project

Design If you want to use automated testing, the best practice is to use separate projects for testing and for the actual integration or data migration.
Create a test project in the same way as an integration or data migration project. For the test project, only set up test tasks.

Data is written to target

Operation Based on the target document, the mapped data is written to the target of the message.

Data mapping is done

Operation Based on the record mapping and field mapping setup of the message the source data is mapped to the target records and fields.

Data validations are done in the staging journal

Operation

In Connectivity studio, you can use the staging concept to validate data in an intermediate area before it is further processed. This is usually used to import data into D365 FO from another system. In this way, you can validate the data before it is written into the D365 FO database.

Usually, the data validations are done automatically.

Define data migration setup

General

When you have selected the AX2012 tables which data you want to migrate to D365 FO, the related data migration setup records are created. Complete the created data migration records.

Define field mapping sequence

Design

On import, the business logic in D365 FO calls the ModifiedField method. This method can set or change other values. If the field mapping sequence is not right, it can reset values that were just imported.
You can modify the sequence of the fields in the mapping. The sequence is important if you:

  • Import data.
  • Exchange data between D365 FO environments.
Tip: Set field mappings in the sequence in which you fill in the fields in the related form.

Define field sequence

Design The sequence, in which you set up the fields for the record, defines the sequence in which the related data is exchanged. If required, you can change the sequence of the fields.

Define project applications

Analysis On the project, define the applications that are involved in the project.
If you define an application for a connector, document, or type conversion, only the applications of the applicable project are available.
The project applications are also used during export and import of the project. During project:
  • Export, the project applications are exported as well.
  • Import, the applicable applications are imported before the connectors, documents, and type conversions are imported.

Define range

Design

For each record, you can define the range of data that is queried for export or import.

For example, you only want to export sales orders for a specific customer group. To do so, on the Range tab, add a record for the CustGroup field.

For more information on how to define ranges in the Range field, refer to Advanced filtering and query syntax.

Define record mapping query settings

Design

When you run a message, a query is applied. For this query, you can define several specific settings:

  • Cross company:
    If you run a message, the query is done in the company where you started the message. However, sometimes, for a specific record, it can be required to read data from another company. For example, if you run a message for a purchase order, you can read the data of a related sales order in another company.
  • Time validation:
    If a record is valid for a specific time period, by default, the time validation uses the currently valid record. You can disable the time validation. As a result, the first available record is used, which is not necessarily the currently valid record. For example, addresses are valid during a specified time period. With time validation applied, the currently valid address is used. With time validation disabled, the address with the first time period is used.

Define record mapping sequence

Design You can change the sequence of the record mappings to make sure the related data is exchanged in the right order.

Define record sequence and structure

Design

You can organize the business document records in these ways:

  • Define a parent-child relation with another record to define the record structure.
  • Change the sequence of the records.
You can only change the sequence of records if these are on the same level in the record structure. So, for example:
  • If you move a parent record, you cannot move it to a position below its child records. The child records stay as child records and are moved as well.
  • You can change the sequence of child records with the same parent.

Define sorting

Design

For internal documents and ODBC documents, you can define the order in which the data in the record is queried and processed during export or import.

For example, to export sales orders, you want the sales order to be queried by customer and for each customer by delivery date. To do so, on the Sorting tab, add a record for both the CustAccount field and the DeliveryDate field, in this sequence.

Define task dependencies

Design
 
If a task depends on one or more other tasks, define the dependencies. So, the task is not done before the other tasks are done. The dependencies are only taken into account if you run a project.
You can use task dependencies to schedule data import or export in batch. The main reasons to use task dependencies are:
  • Sequence of data: For data migrations or in complex integration scenarios, often it is required that data is imported in a specific sequence.
  • Performance: Tasks that are scheduled at the same level, can be processed in parallel. This improves the performance of the data import or export. Also, the messages, as defined for a task, are run in parallel.
Before you set up the task dependencies, define data levels for the data to be imported or exported. You can do so, for example, in Microsoft Excel.
For each data level, you can set up one or more tasks.
To define the level of a task, add a dependency to a task of the applicable previous level. As a result, the task level is automatically assigned. For example, if you add a dependency to a level '2' task, automatically, level '3' is assigned to the current task.
To each task, assign the messages that process the data for the task. You can group messages in tasks as desired.
The next picture gives an example of a data migration project. The project is run using the defined task dependencies. As a result, the data is migrated in the required sequence and with a better performance.
Example: A sales order can only be imported if the related customer and item are already available. So, the customer and item must be imported first. In the previous picture, the customer is imported in level 2 and the item in level 3. This is done before the sales order header is imported in level 4 and the sales order line in level 5.
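The level assignment described above can be sketched as follows: a task without dependencies is level 1, and any other task gets one more than the highest level it depends on. This is an illustrative sketch; the task names are hypothetical.

```python
def assign_levels(dependencies):
    """Derive task levels from task dependencies.

    `dependencies` maps a task name to the list of tasks it depends on.
    A task with no dependencies is level 1; otherwise its level is one
    more than the highest level among its dependencies, matching the
    rule that adding a dependency to a level-2 task yields level 3.
    """
    levels = {}

    def level_of(task):
        if task not in levels:
            deps = dependencies.get(task, [])
            levels[task] = 1 if not deps else 1 + max(level_of(d) for d in deps)
        return levels[task]

    for task in dependencies:
        level_of(task)
    return levels

# Hypothetical data migration tasks matching the sales order example:
deps = {
    "Parameters": [],
    "Customers": ["Parameters"],
    "Items": ["Customers"],
    "Sales header": ["Items"],
    "Sales line": ["Sales header"],
}
print(assign_levels(deps))
```

With this setup, Customers lands on level 2 and Items on level 3, so both exist before the sales order header (level 4) and line (level 5) are imported, as in the example. Tasks that end up on the same level have no dependency on each other and can run in parallel.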

Define transformation for mapping field

Design You can use a transformation to change a source value into another value.
If a transformation is required for a field mapping, add the relevant transformation to the field mapping.

Define type conversion for field mapping

Design You can use a type conversion to convert the data to match the format as required in the target. With a type conversion, you can convert values from any type to string or from string to any type. Usually, the string value is the external value.
If a type conversion is required for a field mapping, add the relevant type conversion to the field mapping.

Delete Connectivity studio parameters

General
If you have copied a database to the current environment and you want to manually reset the Connectivity studio parameters, first delete these parameters. As a result, all parameter settings are deleted, and a new environment ID is generated.
If the imported parameters are fine, you can also decide to only reset the Environment ID. As a result, only these fields are reset:
  • Environment ID: A new ID is generated for the environment.
  • Environment type: The environment type is reset to None.
  • Description: The description is reset to the name of the current environment.

Delete document

Deployment On the Document page, you cannot delete a document if it is used on a message or if it has records and fields. However, if such a document is no longer desired, to clean up your environment, you can still delete it. On delete, only the document is deleted.
Usually, you do not clean up your documents frequently.
Beware: Cleaning up your documents results in a hard delete of the applicable documents. Note that the messages in which the document is used cannot run until another document is linked to the message. So, be very careful when you use the clean-up functionality.

Delete message

Deployment On the Message page, you cannot delete a message if it is used or if it has field mappings. A message can be used on, for example, a task, a web service action, or an EDI document flow. However, if such a message is no longer desired, to clean up your environment, you can still delete it. On delete, only the message is deleted.
Usually, you do not clean up your messages frequently.
Beware: Cleaning up your messages results in a hard delete of the applicable messages. Note that deleting a message has a consequence for the entities where it was used. For example, a web service action cannot run properly until another message is linked. So, be very careful when you use the clean-up functionality.

Delete project

Deployment On the Project page, you cannot delete a project if it is used as a sub project or if it has children, like tasks. However, if such a project is no longer desired, to clean up your environment, you can still delete it. On delete, everything that belongs to the project is deleted.
Usually, you do not clean up your projects frequently.
Beware: Cleaning up your projects results in a hard delete of the applicable projects. So, be very careful when you use the clean-up functionality.

Delete web service action

Deployment On the Web service action page, you cannot delete a web service action if it is used or if it has attributes or arguments. A web service action can be used on, for example, a task or an EDI document flow. However, if such a web service action is no longer desired, to clean up your environment, you can still delete it. On delete, only the web service action is deleted.
Usually, you do not clean up your web service actions frequently.
Beware: Cleaning up your web service actions results in a hard delete of the applicable web service actions. Note that deleting a web service action has a consequence for the entities where it was used. For example, an EDI document flow cannot run properly until another web service action is linked. So, be very careful when you use the clean-up functionality.

Develop external code to run a custom service

Design You can develop custom code outside D365 FO to directly run a custom service that is provided for Connectivity studio web services.
To use the external custom code to run a:
  • SOAP-based custom service, use this URL:
    • https://Instancename.cloudax.dynamics.com/soap/services/BisWsWebserviceOperation?wsdl
  • JSON-based custom service, use one of these URLs:
    • https://Instancename.cloudax.dynamics.com/api/services/BisWsWebserviceOperation/BisWsWebserviceCall/executeOperation. This is the basic URL.
    • https://Instancename.cloudax.dynamics.com/api/services/BisWsWebserviceOperation/BisWsWebserviceCall/executeOperationV2. Use this URL if you want to run the web service for a specific company. The 'companyId' parameter in the body parameters is only applicable if you use this URL. Make sure the D365 FO user has permission to access the specified company.
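Building a request for the JSON-based custom service can be sketched as follows. This is a hedged sketch only: the body field names ('companyId' aside, which the text mentions) and the argument layout are assumptions — check your environment's service contract — and authentication (an OAuth bearer token for the D365 FO user) is omitted.

```python
import json

def build_operation_request(instance, operation, arguments, company_id=None):
    """Build the URL and JSON body for the Connectivity studio custom
    service. Uses executeOperationV2 when a company is specified, per
    the documentation; executeOperation otherwise.

    The body layout here is illustrative, not a confirmed contract.
    """
    base = (f"https://{instance}.cloudax.dynamics.com"
            "/api/services/BisWsWebserviceOperation/BisWsWebserviceCall")
    if company_id:
        # 'companyId' is only applicable on the V2 URL
        url = f"{base}/executeOperationV2"
        body = {"operation": operation, "companyId": company_id, **arguments}
    else:
        url = f"{base}/executeOperation"
        body = {"operation": operation, **arguments}
    return url, json.dumps(body)

# Hypothetical instance, operation, and company:
url, body = build_operation_request("contoso", "ImportSalesOrder", {}, company_id="USMF")
print(url)
```

Remember that for the V2 call, the D365 FO user must have permission to access the specified company.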

Download BIS Documentation Template and move it to Shared path

General

To analyze a project, you can create a Microsoft Word document based on the project. A summary of the project setup and related components setup is added to the document. You can use the document to review the setup.

To create the document, you must define the applicable Microsoft Word template in the Connectivity studio parameters.

To do so, first download the BIS Documentation Template.dotx template. Move the downloaded template to the folder as defined in the Connectivity studio parameters, in the Version management section, in the Shared path field.

Download transactions

Deployment When a project is exported, the transaction and the created XML file are stored in the Download transactions. Download the XML file to your local downloads folder.

Events are logged in the Data synchronization log

Operation For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.

Export Connectivity studio parameters

General

If you want to copy a database to the current environment and you want to keep the Connectivity studio parameters, first export these parameters.

As a result, an XML file with the parameter settings is created and downloaded to your local downloads folder.

Export files for comparison

Design

To compare data, first export your data from the applicable D365 FO environment to XML files. You can use XML files to generate a content package.

To generate an XML file for a content package, run the applicable message. As a result, an XML file with the content, as defined by the message, is generated and placed in a specified folder on your Azure Storage account. You can run several messages to export different sets of data to XML files. You can generate a content package based on several XML files. In this case, make sure all desired XML files for the content package are placed in the same Azure Storage account folder.

 

Export project to file

Deployment

To exchange a configuration, you can export a project. All elements that are related to the project are exported. Transactions and history are not exported. For example, the related messages and web service actions are exported, but the message history and web service history are not exported.

Which settings are included in the export is defined by the Export options parameter in the Connectivity studio parameters. If you export a project, and the export option is:

  • All, the full project configuration is exported, including the not encrypted secrets.
  • Encrypt environment secrets, the full project configuration is exported, but the secrets are encrypted. To encrypt the secrets, you must define the Encryption key in the Connectivity studio solution parameters.
  • Skip environment, the project configuration is exported except for the environment-specific properties of connectors and webservice actions.

As a result of the export, an XML file is created with the project configuration.

This topic explains how to export a project configuration to a file.

Export released project as data entity with Data management framework

Analysis

If you have released a Connectivity studio project, a project release record is created in the BisReleaseTable table. This record contains an XML file with the data of the project and all related components. You can export this project release as a data entity with the Data management framework.

Export secret references

Design

You can export secret references from a D365 FO environment to be imported in another D365 FO environment.

If you export a project, the secret references are not exported. If an export of the secret references is required, you can export the secret references separately.

Usually, you only export and import secret references to a D365 FO environment of the same type. For example, you only export secret references from a Development environment to import these in another Development environment. Reason: You don't want to mix up data. For example, you don't want to mix up Test data with Production data.

External web service receives request and responds to it

Operation The external web service receives the request and responds to it.

Find request message of applicable web service action

Operation If a web service action is 'subscribed' to a processed event, find the request message that is defined for the web service action. This request message is used to check if the event complies with the source document of the message.

Generate data migration message

General

You can generate messages based on the data migration setup records. If you generate a message, these elements are generated:

  • An ODBC document based on the source table in AX2012.
  • A D365FO document based on the target table in D365 FO.
  • A message with the:
    • Default Database connector of the data migration project as source connector.
    • Default D365FO connector of the data migration project as target connector.
    • Generated ODBC document as source document.
    • Generated D365FO document as target document.
    • Mapping of the document record and record fields that exist in both the source document and the target document.
Be aware that a generated message often needs some fine-tuning due to differences between the AX2012 table and D365FO table.
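The mapping part of the generation can be sketched as follows: only the record fields whose names exist in both the source document and the target document are mapped. This is a minimal sketch; the field names are illustrative, not the actual AX2012 or D365 FO schemas.

```python
# Hypothetical sketch: the generated mapping covers only fields whose names
# exist in both the source (ODBC) document and the target (D365FO) document.
def generate_field_mapping(source_fields, target_fields):
    """Return (mapped, unmapped_source_fields)."""
    target_set = set(target_fields)
    mapped = [f for f in source_fields if f in target_set]
    unmapped = [f for f in source_fields if f not in target_set]
    return mapped, unmapped

mapped, unmapped = generate_field_mapping(
    ["CustGroup", "Name", "PaymTermId", "ClearingPeriod"],  # AX2012 table (illustrative)
    ["CustGroup", "Name", "PaymTermId", "TaxGroupId"],      # D365 FO table (illustrative)
)
# 'ClearingPeriod' stays unmapped and would need the manual fine-tuning
# mentioned above.
```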

Generate tasks for data migration project

General

When you have reviewed and completed the data migration setup records with related message and documents, you can generate tasks based on the records.

The tasks are created based on the areas and sublevels as assigned to the data migration setup records in this way:
  1. A root task is created for the record whose area level is the lowest and whose area sublevel is also the lowest.
  2. A task is created beneath the root task for the record with the same area and the next area sublevel. If no such record exists, a task is created for the record whose area is the next level and whose area sublevel is the lowest for that area.
  3. A task is created beneath the previous task for the record with the same area and the next area sublevel. If no such record exists, a task is created for the record whose area is the next level and whose area sublevel is the lowest for that area.
  4. Step 3 is repeated until a task is created for each record.
To each created task, the message is added as defined for the related data migration setup record. Note: If records exist with the same area and area sublevel, only one task is created based on these records. And for all these records, the messages are added to this one task.
On task generation, when a message is added to a task, and the related data migration setup record is:
  • Active, the related message action is set to 'Run'. So, if the task is run, the message is run.
  • Not active, the related message action is set to 'Skip'. So, if the task is run, the message is not run.

Example

For a data migration project, these data migration setup records are used:
Note the use of the status, areas, area sublevels, and record activation.
The generated task structure, as shown on the Project page, is:
The generated tasks with added messages, and the set message actions are:
Task                 Message      Action
Area: 10, level: 10  CUSTGROUP    Run
Area: 10, level: 10  VENDGROUP    Skip
Area: 20, level: 10  INVENTTABLE  Skip
Area: 30, level: 20  VENDTABLE    Run
Area: 30, level: 30  PURCHTABLE   Skip
Area: 40, level: 20  CUSTTABLE    Run
Area: 40, level: 30  SALESTABLE   Skip
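The generation order described in the steps above can be sketched as follows, using the example records. This is a minimal sketch of the chaining logic; the record structure is illustrative, not the actual data migration setup tables.

```python
# Hypothetical sketch: one task per unique (area, area sublevel), chained
# parent -> child in ascending order; each record's message is attached with
# action 'Run' (record active) or 'Skip' (record not active).
def generate_tasks(records):
    tasks = {}  # (area, sublevel) -> task, in sorted insertion order
    for rec in sorted(records, key=lambda r: (r["area"], r["sublevel"])):
        key = (rec["area"], rec["sublevel"])
        task = tasks.setdefault(key, {"area": rec["area"], "sublevel": rec["sublevel"],
                                      "messages": [], "parent": None})
        task["messages"].append((rec["message"], "Run" if rec["active"] else "Skip"))
    chain = list(tasks.values())
    for parent, child in zip(chain, chain[1:]):
        child["parent"] = (parent["area"], parent["sublevel"])
    return chain

records = [
    {"area": 10, "sublevel": 10, "message": "CUSTGROUP", "active": True},
    {"area": 10, "sublevel": 10, "message": "VENDGROUP", "active": False},
    {"area": 20, "sublevel": 10, "message": "INVENTTABLE", "active": False},
    {"area": 30, "sublevel": 20, "message": "VENDTABLE", "active": True},
    {"area": 30, "sublevel": 30, "message": "PURCHTABLE", "active": False},
    {"area": 40, "sublevel": 20, "message": "CUSTTABLE", "active": True},
    {"area": 40, "sublevel": 30, "message": "SALESTABLE", "active": False},
]
tasks = generate_tasks(records)
# Six tasks: CUSTGROUP and VENDGROUP share the one (10, 10) task.
```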

Get latest project version

Analysis If in an environment, the current project version is not the latest project version, you can get the latest project version. As a result, the current project version is replaced with the latest project version.
If you have checked out a project and made changes in the current environment, you can undo these changes with 'Get latest'. In this case, the check-out of the project is also undone.

Import Connectivity studio parameters

General

If you have copied a database to the current environment and you first exported the Connectivity studio parameters, you can import these parameters.

As a result, the Connectivity studio parameters are automatically reset with the original environment parameters.

Import ECS result into D365 F&O

Design

When you process comparison results, for each table, a separate XML file is created. By default, the generated XML files are stored in the Import folder. You can import the files into D365 FO to create the desired environment setup. The import message, with the ECS Azure file storage connector as source connector, gets the XML files from the Import folder.

 You can: 

  • Change the XML output folder. 

  • Move the comparison results XML file to the Import folder of another environment. Then, you can import the data in that environment. 

Import Logic App tutorial

Design

If you want to run an inbound web service or outbound web service in the cloud using Azure Logic Apps, first download and extract the Project Tutorial - Logic App.zip file. This file contains a project with an inbound web service action and an outbound web service action.

Import project data entity with Data management framework

Analysis

If you have exported a project release as a data entity, you can import it in another environment with the Data management framework. As a result, a project release record is created in the BisReleaseTable table that contains an XML file with the data of the imported project and all related components.

To view the created project release, go to Connectivity studio > Periodic > Version management > Release.

Import project from file

Deployment

To exchange a configuration, you can export a project. All elements that are related to the project are exported. Transactions and history are not exported. For example, the related messages and web service actions are exported, but the message history and web service history are not exported.

As a result of the export, an XML file is created with the project configuration. You can use this XML file to import the configuration, for example, in another environment.

This topic explains how to import a project configuration from a file.

Import project from resource

Deployment

Several project configurations are included as resources in the Connectivity studio release. If the Connectivity studio is deployed, you can import these project configurations from the resources.

Import received Service Bus data

Operation

You can import data from a Service Bus queue or topic subscription.

First, you must receive the data from the Service Bus queue or topic subscription. On receiving data from the Service Bus, based on the Service Bus search definitions and settings on the received data, import messages are automatically assigned to the received data records.
When the data is received, you can import the received data from the 'Received data from queue' table into D365 FO. To import the received data, run the messages as assigned to the received data records. You can run the messages in several ways, as desired. You can run a message by running a:
  • Project
  • Task
  • Message
If an import message run finishes:
  • Successfully, the related received data record gets the status 'Finished'. If 'peek lock' is used, the Service Bus message is removed from the queue or topic subscription.
  • With errors, the related received data record gets the status 'Error'. If 'peek lock' is used, the Service Bus message is moved from the queue or topic subscription to the related dead letter queue.
Note: The assigned import messages must have the 'Service Bus queue' connector as source connector.

Import secret references

Design

You can import secret references that are exported from another D365 FO environment.

If you import a project, the secret references are not included. If the related secret references are required, you can import the secret references separately.

Usually, you only export and import secret references to a D365 FO environment of the same type. For example, you only export secret references from a Development environment to import these in another Development environment. Reason: You don't want to mix up data. For example, you don't want to mix up Test data with Production data.

Inbound web service action is triggered

Operation
The inbound web service application triggers the applicable inbound web service action.
Based on these parameters in the HTTP request, the inbound web service calls the executeWebserviceOperation method that determines which web service action is triggered:
  • Project (ProjectId): Does a Connectivity studio project exist in D365 FO with this ID?
  • Web service action (WebServiceId): Does an inbound web service action exist with this ID?
  • HTTP action (HttpMethod): What is the goal of the HTTP request? Does this goal match with the HTTP action as defined for the web service action?
  • User name (UserName): Is this user set up as web service user? And is this user allowed to run the web service action?
  • URL (Url): If REST, does the URL contain arguments? And what are the values of these arguments?
  • Content (content):
    • For both web service types (Basic and REST), if the HTTP action is Post, Put, Delete, Patch, or 'Post or Put', the content contains the values to be processed.
    • If the web service type is SOAP and the HTTP action is Get, the content contains the arguments of type parameter. What are the values of these arguments?
  • Company ID (companyId): Is the web service action and web service user combination valid for this company?
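The checks in the list above can be sketched as follows. The lookup tables, names, and error texts are illustrative assumptions, not the actual D365 FO implementation of executeWebserviceOperation.

```python
# Hypothetical sketch: validating the HTTP request parameters before a web
# service action is triggered. The dicts stand in for D365 FO setup data.
def resolve_action(params, projects, actions, users):
    if params["ProjectId"] not in projects:
        return None, "Unknown project"
    action = actions.get(params["WebServiceId"])
    if action is None:
        return None, "Unknown web service action"
    if params["HttpMethod"] not in action["allowed_methods"]:
        return None, "HTTP action does not match"
    user = users.get(params["UserName"])
    if user is None or params["WebServiceId"] not in user["allowed_actions"]:
        return None, "User not allowed"
    if params["companyId"] not in user["companies"]:
        return None, "Invalid company"
    return action, None

projects = {"PRJ-SALES"}                                          # illustrative IDs
actions = {"WS-IMPORT-ORDERS": {"allowed_methods": {"Post"}}}
users = {"svc-integration": {"allowed_actions": {"WS-IMPORT-ORDERS"},
                             "companies": {"USMF"}}}
params = {"ProjectId": "PRJ-SALES", "WebServiceId": "WS-IMPORT-ORDERS",
          "HttpMethod": "Post", "UserName": "svc-integration", "companyId": "USMF"}
action, error = resolve_action(params, projects, actions, users)
# error → None: all checks passed, so the action is triggered
```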

Inbound web service application receives HTTP request

Operation The inbound web service application receives an HTTP request from the external application.

Inbound web service receives default response text and sends it to external application

Operation The inbound web service receives a default response text and sends it to the external application.

Inbound web service receives response and sends it to external application

Operation The inbound web service receives a response and sends it to the external application.

Inbound web service staging table records are processed in batch

Operation

Initialize document record fields

Design

For documents of type Text or Microsoft Excel, you can initialize the fields for a record. To initialize record fields for:

  • A Text document, use a connector of type Azure file storage and an input file of type TXT. The fields are initialized based on the header-line setup in the TXT file.
  • A Microsoft Excel document, use a connector of type Azure file storage and an input file of type XLSX. The fields are initialized based on the header-line setup in the XLSX file.

When the initialization is finished, review and complete the properties of the initialized fields. Usually, during review, you do not add fields. However, you can remove the fields that are not needed.
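The header-line initialization can be sketched as follows, assuming a semicolon-delimited TXT file (in practice, the delimiter comes from the document setup). The field names are illustrative.

```python
# Hypothetical sketch: deriving the document record fields from the header
# line of a TXT input file.
def init_fields_from_header(header_line, delimiter=";"):
    return [name.strip() for name in header_line.split(delimiter) if name.strip()]

fields = init_fields_from_header("ItemId;ItemName;UnitId;SalesPrice")
# fields → ['ItemId', 'ItemName', 'UnitId', 'SalesPrice']
```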

Initialize document record fields - ODBC

Design

For documents of type ODBC, you can initialize the fields for a record. To initialize record fields for an ODBC document, use a connector of type Database to connect to the applicable external database.

Make sure the name in the Record table field exactly matches the name of the relevant table in the external database.
The fields are added to the record based on the fields of the external table.
When the initialization is finished, review and complete the properties of the initialized fields. Usually, during review, you do not add fields. However, you can remove the fields that are not needed.

Initialize document records and fields

Design
For documents of type XML or JSON, you can initialize the records based on an input file. 
To initialize records and fields for:
  • An XML document, you can use an input file of type XML or XSD.
  • A JSON document, use an input file of type JSON.
On initializing records, also the fields that are defined in the input file, are initialized. If you initialize based on:
  • An XML file, only the field names are known and added to the relevant document records.
  • An XSD file, besides the field names, also the field properties can be defined and added to the relevant document records.
  • A JSON file, only the field names are known and added to the relevant document records.
When the initialization is finished, review and complete the properties of the initialized records and fields.
Usually, during review, you do not add records or fields. However, you can remove the records and fields that are not needed.
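The difference between the input file types can be sketched as follows: from XML and JSON files, only field names can be derived, while an XSD would additionally carry field properties such as types and lengths. The payloads below are illustrative.

```python
# Hypothetical sketch: extracting field names from XML and JSON input files.
import json
import xml.etree.ElementTree as ET

def fields_from_xml(xml_text):
    root = ET.fromstring(xml_text)
    first_record = next(iter(root))          # first record element
    return [child.tag for child in first_record]

def fields_from_json(json_text):
    data = json.loads(json_text)
    record = data[0] if isinstance(data, list) else data
    return list(record.keys())

xml_fields = fields_from_xml(
    "<Customers><Customer><AccountNum>1001</AccountNum>"
    "<Name>Contoso</Name></Customer></Customers>")
json_fields = fields_from_json('[{"AccountNum": "1001", "Name": "Contoso"}]')
# Both yield only the names ['AccountNum', 'Name']; no types or lengths.
```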

Initialize form mapping from message mapping

Design

You can initialize a form mapping from an existing message. As a result, the record mapping and field mapping of the message are translated to a form mapping.

You can do this, for example, to bridge the gap between functional users and technical users.

Initialize message mapping from form mapping

Design If you have finished the form mapping recording and reviewed it, you can use this to initialize the related message mapping.
As a result of the initialization, based on the form mapping:
  • The recorded records and fields are added to the internal document of the message.
  • The message record mapping and field mapping are set up.
After initialization, you can edit and fine-tune the internal document and the message.

Initialize project in environment

Analysis
The main purpose of 'Get latest' on the Project version management page is to initialize a version-controlled project in the current environment. So, if a project is not available in the current environment, you can select it from the list and make it available.
Besides this, you can also use 'Get latest' in these cases:

  • If in an environment, the current project version is not the latest project version, you can get the latest project version. As a result, the current project version is replaced with the latest project version.
  • If you have checked out a project and made changes in the current environment, you can undo these changes with 'Get latest'. In this case, the check-out of the project is also undone.

Install Azure Storage Explorer and connect to an Azure Storage Account

Design
You can use the Azure Storage Explorer to connect to and manage your Azure Storage Accounts.
First download and install the Azure Storage Explorer and then connect to a storage account.
To connect, use a storage account name and key. You can find the name and key on the Azure Portal > Storage accounts > Access keys.
When connected, and you use the Azure Storage Explorer for:
  • Azure file storage connectors: You can create the required file shares.
  • Blob storage connectors: You can create the required Blob containers.
  • Version management: You can create the required version management file share.
  • History reports: You can create the required history report file share.
For more information, refer to Get started with Storage Explorer.

Install deployment tool

Design

Run the IIS application deployment tool to install the web site.

Before you can run the deployment tool, download the BisMessageHttpActionD365FO.zip file.

Install on-premises Windows service - BIS Azure Service Bus

Design

Install the BIS Azure Service Bus as Windows service on the external on-premises server.

To install the BIS Azure Service Bus, first download the BIS Azure Service Bus.zip file.

Install XPO on AX2012

Design

In D365 FO, enums can be extendable. For extendable enums, the enum value numbering is not fixed. Therefore, an enum value number in AX2012 can be different from the number of the related enum value in D365 FO.

To prevent a mismatch of enum values when migrating data, migrate enums based on the enum value name instead of the enum value number. To do so, on the AX2012 environment:
  1. Download and extract the SharedProject_Ax2012PrepMigrateD365FO ZIP file.
  2. Install the SharedProject_Ax2012PrepMigrateD365FO.xpo file. As a result, these enum transformation tables are installed in AX2012:
    • BisDmMigrationtableFieldEnum
    • BisDmMigrationenumValue
  3. Run the BisDmMigrationCreateEnumValues job.
As a result, you can migrate the enum values based on the names instead of the numbers.
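Why name-based migration is safer can be sketched as follows. The enum names and numbers are illustrative, not an actual D365 FO enum.

```python
# Hypothetical sketch: the same enum value can have different numbers in
# AX2012 and D365 FO (extensible enums), so migrate via the value name.
AX2012_STATUS = {0: "None", 1: "Open", 2: "Closed"}    # fixed numbering
D365FO_STATUS = {"None": 0, "Open": 10, "Closed": 20}  # extensible, renumbered

def migrate_enum_value(ax_number):
    name = AX2012_STATUS[ax_number]   # number -> name (via the transformation tables)
    return D365FO_STATUS[name]        # name -> number in D365 FO

migrated = migrate_enum_value(2)
# migrated → 20; copying the raw number 2 would have produced a wrong value.
```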

Manually fix errors

Design If you have re-run the automated error check, and still errors exist in the setup that are not fixed automatically, manually fix these errors.

Mark message run as Solved

Operation If the errors of a message run are solved, and no re-run in History management is required, the message run status is not automatically set to Finished. Therefore, you can mark the message run as Solved. As a result, the message run status is set to Finished.

Merge message

Design

You can troubleshoot a failing message using a copy of that message. When you have made changes to the message copy to solve the issue, the same changes must be applied to the original message. To do so, you can merge the message copy with the original message.

When you merge a message (source) with another message (target), the:

  • Target message settings are overwritten with the source message settings.
  • Target message documents settings are overwritten with the source message documents settings.

Note: Use this merge option carefully. You cannot undo the overwrite action.

Monitor data received from Service Bus dead letter queue

Operation

You can monitor the data that you receive from an Azure Service Bus dead letter queue.
Use the Received status to decide if any troubleshooting action is required.

The possible statuses are:
Status Description
New The data is received from the Service Bus dead letter queue. Based on the Service Bus search definitions, a message is assigned.
In process The data is now imported into D365 FO by the assigned message.
Finished The data import into D365 FO is finished successfully.
Error The data import into D365 FO is finished with errors.
Invalid The data is received from the Service Bus dead letter queue. But, based on the Service Bus search definitions, no message is assigned.

Monitor data received from Service Bus queue or topic subscription

Operation
You can monitor the data that you receive from an Azure Service Bus queue or topic subscription.
Use the Received status to decide if any troubleshooting action is required.
The possible statuses are:
Status Description
New The data is received from the Service Bus queue or topic subscription. Based on the Service Bus search definitions, a message is assigned.
In process The data is now imported into D365 FO by the assigned message.
Finished The data import into D365 FO is finished successfully.
Error The data import into D365 FO is finished with errors.
Invalid The data is received from the Service Bus queue or topic subscription. But, based on the Service Bus search definitions, no message is assigned.

Monitor data sent to Service Bus

Operation

On export of data to an Azure Service Bus queue or topic, a message is added to the Service Bus queue. Each message that is added to the Service Bus queue is logged in Connectivity studio.

You can monitor this data, for example, for troubleshooting purposes.
Note:
By default, only records with errors are shown. If on receipt of the Service Bus message an error occurs, the Service Bus moves the message to the related dead letter queue. If you get the dead letter queue data from the Service Bus, and the message exists in the 'Data sent to queue' table, its received status is set to 'Returned with error'.

Monitor data synchronization log

Operation

For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.
This topic explains how to monitor the data synchronization log.

If table events are not logged in the Data synchronization log as expected, you can check the data synchronization setup for your message or outbound web service action.

Monitor inbound web service staging table

Design

Depending on the asynchronous execution mode of the inbound web service action, the inbound web service process runs directly or asynchronously.
If the asynchronous execution mode is 'Batch' and the web service action is triggered, the data, as received by the web service request, is stored in the Inbound web service staging table.

You can monitor the inbound web service staging table.

Monitor project history

Operation

The project history shows the project runs of the selected project. For each project run, it shows, for example, the status and whether the project tasks ran with errors.

Monitor project versions

Analysis To monitor project versions, you can open a Project version management page that shows:
  • All projects.
  • For each project, the latest version and the version in the current environment.
  • For each selected project, the project version history log.

Monitor task history

Operation

Usually, you run a task by running the related project. However, you can also run a single task, for example, for testing purposes.
When a task is run, you can view the task history. The task history shows, for example, the status of the task run and whether the related messages ran with errors.

Monitor web service action history

Operation When a web service action is run, you can view the web service history. The web service history shows, for example, the:
  • Status of the web service action run.
  • Related web service requests.
  • Related message runs.
Possible issues can occur, for example, in the:
  • Message run.
  • Connection with the web service.

Outbound queue is processed

Operation

For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the data synchronization log. On processing the data synchronization log, based on the logged events, records are added to the outbound queue. On processing the outbound queue, for each record, the related message or web service action is run to export the applicable data.

Process data synchronization log

Operation

For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log. To fully process the logged events, process the:

  1. Data synchronization log: The logged events are processed and for each applicable message or web service action, a record is added to the Outbound queue.
  2. Outbound queue: For each record, the related message or web service action is run to export the applicable data.

This topic explains how to process the data synchronization log.

Note:

  • Events are logged only for tables and event types that are defined in the data synchronization setup of any message or web service action.
  • If on the data synchronization setup, the Update check box and the Check used fields check box are selected, event logging is restricted to changes to the fields as defined for the record in the applicable source document.
  • Only the event is logged and used for further processing. So, in case of an update, an event is logged of type Update. However, the changed data is not marked as changed and not used for further processing. For example, if you change a customer address, an event is logged that the customer is updated. However, the address change as such is not logged. When, for the logged event, a record is processed in the outbound queue, the then current data of the customer is used.
  • If a logged event is processed, it is deleted from the data synchronization log.

Example:

This table gives examples of the results for several data synchronization log processing scenarios:

Number of pages  Page size  Limit records  Number of records to be processed  Result
5                10,000     No             40,000                             4 pages of 10,000 records are processed
5                10,000     Yes            40,000                             4 pages of 10,000 records are processed
5                10,000     No             80,000                             5 pages of 16,000 records are processed
5                10,000     Yes            80,000                             5 pages of 10,000 records are processed
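The logic behind these results can be sketched as follows. This is a minimal sketch assuming this interpretation of the scenarios: without a record limit, the page size grows so that all records fit in the configured number of pages; with a limit, processing stops at pages x page size records.

```python
# Hypothetical sketch reproducing the scenario table above.
import math

def plan_processing(num_pages, page_size, limit_records, total_records):
    """Return (pages_processed, effective_page_size)."""
    pages_needed = math.ceil(total_records / page_size)
    if pages_needed <= num_pages:
        return pages_needed, page_size
    if limit_records:
        return num_pages, page_size                         # excess records wait
    return num_pages, math.ceil(total_records / num_pages)  # page size grows

plan_processing(5, 10_000, False, 80_000)  # → (5, 16000)
plan_processing(5, 10_000, True, 80_000)   # → (5, 10000)
```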


Process outbound queue

Operation
For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.
To fully process the logged events, process the:
  1. Data synchronization log: The logged events are processed and for each applicable message or web service action, a record is added to the Outbound queue.
  2. Outbound queue: For each record, the related message or web service action is run to export the applicable data.

This topic explains how to process the outbound queue.


Outbound queue
The outbound queue only has unique entries for each combination of:
  • Record on which the event was logged
  • Message or web service action
  • Event type
  • Status
Note that the record and event type are known, but the (changed) data of the records is not known. When an outbound queue entry is processed, the current data of the record is used. For example, if a customer address is changed, an event is logged that the customer is updated. However, the address change as such is not logged. When, for the logged event, a record is processed in the outbound queue, the current data of the customer is used.
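The uniqueness rule can be sketched as follows; the identifiers are illustrative. Because the queue is keyed on the combination, repeated updates to the same record collapse into a single entry.

```python
# Hypothetical sketch: the outbound queue keeps one entry per unique
# (record, message/web service action, event type, status) combination.
def enqueue(queue, record_id, action_id, event_type, status="Waiting"):
    queue.add((record_id, action_id, event_type, status))

outbound_queue = set()
enqueue(outbound_queue, "CUST-0001", "MSG-EXPORT-CUST", "Update")
enqueue(outbound_queue, "CUST-0001", "MSG-EXPORT-CUST", "Update")  # collapses
enqueue(outbound_queue, "CUST-0001", "MSG-EXPORT-CUST", "Delete")  # new entry
len(outbound_queue)  # → 2
```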
Bundling

In the outbound queue, several entries can exist for a unique message or web service action. How these entries are processed depends on whether bundling is active. You can activate bundling on the data synchronization setup of an outbound:

  • Message: On processing the outbound queue, by default, all entries for a unique message are bundled in one message run, which results in one file. 
  • Web service action: On processing the outbound queue, by default, all entries for a unique web service action are bundled in one web service action run, which results in one file. Note: You cannot apply bundling to an outbound web service action that uses a stream.
  • EDI document flow: On processing the outbound queue, the entries are bundled in one message run or web service action run, which results in one file. In this case, the outbound queue entries are bundled by the applicable EDI party.

If the bundling field is set to:

  • Yes, all outbound queue entries for the message are bundled and processed in one message run, which results in one file.
  • No, each outbound queue entry is processed in a separate message run.
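The effect of the bundling field can be sketched as follows. The entries and message IDs are illustrative.

```python
# Hypothetical sketch: with bundling, all outbound queue entries for one
# message become a single run (one file); without it, each entry becomes
# its own run (one file per entry).
from itertools import groupby

def build_runs(entries, bundling):
    """entries: (message_id, record_id) pairs from the outbound queue."""
    entries = sorted(entries)
    if bundling:
        return [(msg, [rec for _, rec in group])
                for msg, group in groupby(entries, key=lambda e: e[0])]
    return [(msg, [rec]) for msg, rec in entries]

entries = [("MSG-EXPORT-CUST", "CUST-0001"), ("MSG-EXPORT-CUST", "CUST-0002")]
build_runs(entries, bundling=True)   # → one run containing both records
build_runs(entries, bundling=False)  # → two runs, one record (file) each
```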
Processing the outbound queue
You can process the outbound queue in several ways. To process the outbound queue for:
  • All records, do not define a message or web service action.
  • A specific message, define the message.
  • A specific web service action, define the web service action.
You can also run the outbound queue for a specific project. To do so, when you run a project, set the Process outbound queue field to Yes. As a result, the outbound queue is processed for the messages or web service actions that are linked to the project tasks.
If an outbound queue record is processed, it gets the status:
  • Processed, if it is processed successfully.
  • Error, if errors occurred during processing.

Process received dead letter data

Operation

If a receiver reads a message from a Service Bus queue or topic subscription, an error can occur. If so, the Service Bus moves the message to the applicable dead letter queue.

You can receive the Service Bus dead letter queue data. When received, you can further process this data into D365 FO. For example, you can set a Production order 'On hold' when a dead letter is received on the export of a production order.

To further process the received dead letter data, run the messages as assigned to the received dead letter data records. You can run the messages in several ways, as desired. You can run a message by running a:

  • Project
  • Task
  • Message
Note: The assigned process messages must have the 'Service Bus queue' connector as source connector.

Process using source

Design

When you analyze data differences, you can edit a record value manually. To process your analysis results, you can choose to use the source value. As a result, in the created XML file, for the fields to be edited manually, the source value is used as field value. You can manually edit the value in the resulting XML file.

Process using target

Design

When you analyze data differences, you can edit a record value manually. To process your analysis results, you can choose to use the target value. As a result, in the created XML file, for the fields to be edited manually, the target value is used as field value. You can manually edit the value in the resulting XML file.

Publish imported project release

Analysis If you have imported a project release as a data entity, you can publish the project release in the current environment. As a result, the project and all related components are created based on the data in the XML file of the project release.

Reassign message to data received from Service Bus dead letter queue

Operation If no message is assigned to data that you received from a Service Bus dead letter queue (Received status = Invalid), you must adjust or add Service Bus search definitions for dead letter data. If you have done so, you can retry assigning messages to the received dead letter data records that have no message assigned.
On reassign, for all received dead letter data records without a message, a matching search definition is searched for. If a matching search definition is found:
  • The message and company, as defined for the search definition, are assigned to the received dead letter data record.
  • The received dead letter data record status is set to 'New'.
You can reassign messages:
  • Automatically
  • Manually

Reassign message to data received from Service Bus queue or topic subscription

Operation If no message is assigned to data that you received from a Service Bus queue or topic subscription (Received status = Invalid), you must adjust or add Service Bus search definitions. If you have done so, you can retry assigning messages to the received data records that have no message assigned.
On reassign, for all received data records without a message, a matching search definition is searched for. If a matching search definition is found:
  • The message and company, as defined for the search definition, are assigned to the received data record.
  • The received data record status is set to 'New'.
You can reassign messages:
  • Automatically
  • Manually
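The reassignment flow can be sketched as follows. The keyword-based matching here is an illustrative stand-in for the actual search definition rules; the record and message names are assumptions.

```python
# Hypothetical sketch: reassignment only considers records without a message
# and matches each against the search definitions in order.
def reassign_messages(records, search_definitions):
    for rec in records:
        if rec["message"] is not None:
            continue  # only records without a message are retried
        for sdef in search_definitions:
            if sdef["match"] in rec["payload"]:
                rec["message"] = sdef["message"]
                rec["company"] = sdef["company"]
                rec["status"] = "New"
                break
    return records

records = [{"payload": "<SalesOrder>...</SalesOrder>", "message": None,
            "company": None, "status": "Invalid"}]
search_definitions = [{"match": "SalesOrder",
                       "message": "MSG-IMPORT-SO", "company": "USMF"}]
reassign_messages(records, search_definitions)
# records[0] → message 'MSG-IMPORT-SO', company 'USMF', status 'New'
```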

Receive data from Service Bus dead letter queue

Operation

If a receiver reads a message from a Service Bus queue or topic subscription, an error can occur. If so, the Service Bus moves the message to the applicable dead letter queue.

You can receive the messages from the Service Bus dead letter queue.

On receiving data from the Service Bus dead letter queue, based on the Service Bus search definitions and settings on the received data, messages are automatically assigned to the received dead letter data records. The assigned messages are used to further process the dead letter data into D365 FO. For example, you can set a Production order 'On hold' when a dead letter is received on the export of a production order.

Receive data from Service Bus queue or topic subscription

Operation You can use the 'Service Bus queue' connector to import data from a Service Bus queue.
To import data, first receive the data from a Service Bus queue or topic subscription. The received data is added to the 'Received data from queue' table in Connectivity studio.
On receiving data from the Service Bus:
  • Based on the Service Bus search definitions and settings on the received data, import messages are automatically assigned to the received data records.
  • You can use 'peek lock'. If you use 'peek lock', the received Service Bus messages are temporarily locked in the Service Bus queue or topic subscription.

Record form mapping

Design
For a message, you can record the record mapping and field mapping based on an external document. When recording, you map fields of the external document to form fields in D365 FO.
Before you record a form mapping, set up a message with both an:
  • External document. This document must be fully set up, with all applicable records and fields defined.
  • Internal document. For this document, only the document header is required. So, it does not have records and record fields defined.

Record is created in the Inbound web service staging table

Operation

Register app in Microsoft Entra ID to connect to Azure Storage

Design

If you want to connect to the Azure Blob storage with an authentication of type 'Password' or 'Client credentials', register an app in Microsoft Entra ID and configure the Azure Storage permissions.

For more information, refer to:

Register application with Microsoft Entra ID

Design

Register a native web application in Microsoft Entra ID to access D365 FO. For more information, refer to Register an application with the Microsoft identity platform.

For the web app, only OAuth 2.0 authentication is supported. So, on the registered app, as authentication, add a platform of type Mobile and desktop applications, and choose an option that supports OAuth2.
Make sure the required API permissions to the Dynamics ERP API are granted for the Microsoft Entra ID application. For more information, refer to Permissions and consent in the Microsoft identity platform.

Register Azure Logic App in Microsoft Entra ID

Design

To be able to fill in the Client ID and the Secret in the HTTP action settings of the Azure Logic App, register the Azure Logic App in Microsoft Entra ID. For more information, refer to: Register an app.

Register Microsoft Entra ID application in D365 FO

Design

To connect to D365 FO, you must register the Microsoft Entra ID application in D365 FO.

Release project

Analysis
You can release a project to enable the export of the project as a data file. The latest version of the project is always released.
As a result, a project release record is created in the BisReleaseTable table that contains an XML file with the data of the project and all related components. To view the project release, go to Connectivity studio > Periodic > Version management > Release.

Repair header info in project version files

Analysis

In unusual cases, the header information in the files on the file storage can get corrupted. You can repair the header information in these files based on the project version management table.

Repair project version check-in status on file storage

Analysis In unusual cases, the check-in/out status in the file storage can get corrupted. You can repair the status in the file storage based on the project version management table.

Re-run file actions

Operation

On a message, you can use a connector of type Azure file storage. For an Azure file storage connector, you can set up several file actions. These file actions are run in the defined sequence. If one of the file actions fails, the next file actions are not run.

After you have solved the errors that caused the file action to fail, you can re-run the file actions of the message.

Re-run message run

Operation

If you have solved the errors in a message run, you can run it again. The message is re-run in the company in which it was originally run. As a result:

  • The status of the selected message run is set to Reprocessed.
  • A new message run history record is created for the re-run.

Reset outbound queue record status to New

Operation
When the outbound queue is processed, errors can occur. You can reset the outbound queue record status to New.
For example, when a web service action runs, the external web service can be down. As a consequence, the related outbound queue records fail and get the status Error. When the issue is solved, you can reset the status to New. The next time the outbound queue is processed, these records are processed again.
This topic explains how to reset the status of outbound queue records to New.
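
The reset behavior can be sketched as follows. Record and status names are illustrative, not the product's table schema:

```python
# Hedged sketch of the reset behavior: outbound queue records that failed
# (status 'Error') are reset to 'New' so the next queue-processing run
# picks them up again. Record and status names are illustrative.
def reset_error_records(outbound_queue):
    """Set every 'Error' record back to 'New'; return how many were reset."""
    reset_count = 0
    for record in outbound_queue:
        if record["status"] == "Error":
            record["status"] = "New"
            reset_count += 1
    return reset_count

def process_queue(outbound_queue):
    """Process only records with status 'New' (others are skipped)."""
    return [r["id"] for r in outbound_queue if r["status"] == "New"]

queue = [
    {"id": 1, "status": "Processed"},
    {"id": 2, "status": "Error"},      # web service was down on first run
    {"id": 3, "status": "Error"},
]
reset = reset_error_records(queue)     # 2 records reset
picked_up = process_queue(queue)       # records 2 and 3 are processed again
```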

Reset status of data received from Service Bus queue or topic subscription

Operation
To import data received from a Service Bus queue or topic subscription, you run import messages to read data from the 'Received data' table and import this data into D365 FO. If an import message run finishes with errors, the related received data record gets the status 'Error'. When you have finished troubleshooting the error, you can reset the received data record status.
You can reset the status:
  • At once for all received data records with the same message assigned. You can, for example, do so if you have resolved an error in a message for which you have received several data records.
  • Manually for a specific received data record.
You can, for example, reset the status to:
  • 'New' if you want to run the import message again for the received data records.
  • 'Finished' if no import of the received data record is required.

Reset status of data received from Service Bus dead letter queue

Operation To import data received from a Service Bus dead letter queue, you run import messages to read data from the 'Received dead letter data' table and import this data into D365 FO. If an import message run finishes with errors, the related received dead letter data record gets the status 'Error'. When you have finished troubleshooting the error, you can reset the received dead letter data record status.
You can reset the status:
  • At once for all received dead letter data records with the same message assigned. You can, for example, do so if you have resolved an error in a message for which you have received several dead letter data records.
  • Manually for a specific received dead letter data record.
You can, for example, reset the status to:
  • 'New' if you want to run the import message again for the received dead letter data records.
  • 'Finished' if no import of the received dead letter data record is required.

Restore project version

Analysis

In an environment, you can restore another project version. As a result, the current project version is replaced with the selected project version.

You can restore a project version for these reasons:

  • To temporarily use it, for example, for reviewing or testing purposes or if the latest version has issues. In this case, do not check out the project when you restore it.
  • To make it the basis for further design. In this case, check out the project when you restore it. When you check in the project, the restored version becomes the latest version.

Review and complete data migration setup

General
As a result of the message generation, the following is generated:
  • An ODBC document based on the source table in AX2012.
  • A D365FO document based on the target table in D365 FO.
  • A message with the:
    • Default Database connector of the data migration project as source connector.
    • Default D365FO connector of the data migration project as target connector.
    • Generated ODBC document as source document.
    • Generated D365FO document as target document.
    • Mapping of the document record and record fields that exist in both the source document and the target document.
  • The message is added to the data migration setup record.
For each data migration setup record, review and complete the generated documents and message.

Review form mapping

Design Once the form mapping recording is finished, you can review the recorded form mapping.

Run inbound web service staging records

Operation Depending on the asynchronous execution mode of the inbound web service action, the inbound web service process runs directly or asynchronously.
If the asynchronous execution mode is 'Batch' and the web service action is triggered, the data, as received by the web service request, is stored in the Inbound web service staging table.
These inbound web service staging table records must be processed in batch. When processed, the applicable request message is run. Usually, no response message and error message are defined for an asynchronous inbound web service process.

Run message

Operation
You can run a message directly. You can do so, for example, for testing purposes or to run a specific message in batch.

Run message for testing purposes

Design

You can run a message for testing and troubleshooting purposes. You can find and analyze the results in the:

  • Message history.
  • File history if the message has an Azure file storage connector with file actions.
  • Tracer if you have used the tracer on the message run.
Note: Even if the message setup is validated, a message run can still result in errors or give an unexpected result. In this case, you can choose to store all processed records in the message history for analysis purposes. To do so, before the test run, on the message header, set the Store history field to 'Yes'. When the issue is solved, do not forget to set the Store history field back to 'No'.

Run message from action menu item

Operation You can use a custom action menu item to manually start a message from a form. As a result, for a selected record, you can manually run the message from the form to which you added the action menu item. The message is run for the selected record only.
Example: Create an action menu item and add it to the Sales orders form. On the Sales orders form, you can select a sales order and click the action menu item button. This starts a message to process the selected sales order.

Run message or web service action from dynamic button

Operation

You can add a dynamic button to a form to run a message or an outbound web service action from the form.
To add a dynamic button, no coding is required.
The dynamic button is added to the Business integration tab of the ActionPane of the form.

Run message test cases

Design

To check if all test cases of a message work properly, you can run the message test cases. So, no full message run is done. Only the test cases, as defined for the message, are run.

Run project

Operation You can use a project to exchange data. To be able to run a project, tasks must be set up for the project. If you run the project, all related tasks are run.
Depending on the tasks that are set up for the project, you can use a project to run:
  • An integration or data migration.
  • Outbound web services.
  • Batch classes.
  • Master data management.
  • Test cases.
Usually, you run a project in batch. Based on the defined sub-projects and task dependencies, tasks are run in parallel. If you do not run the project in batch, all sub-projects and tasks are run sequentially.
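
A sequential project run that respects task dependencies can be sketched like this. Task names are illustrative, and the sketch assumes the dependencies contain no cycles (in batch, independent tasks could run in parallel instead):

```python
# Sketch of running project tasks in dependency order. Each task names the
# tasks it depends on; a task only runs after all of its dependencies.
def run_order(tasks):
    """tasks: {task: [dependencies]} -> list of tasks in a valid run order."""
    done, order = set(), []

    def run(task):
        if task in done:
            return
        for dep in tasks[task]:
            run(dep)           # run dependencies first
        done.add(task)
        order.append(task)

    for task in tasks:
        run(task)
    return order

# Hypothetical data migration project with four tasks.
project = {
    "Import customers": [],
    "Import sales orders": ["Import customers"],
    "Import items": [],
    "Import order lines": ["Import sales orders", "Import items"],
}
sequence = run_order(project)
```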

Run task

Operation
You can use a task to run:
  • An integration or data migration. In this case, use a task with the desired messages defined.
  • Outbound web services. In this case, use a task with the desired outbound web service actions defined.
  • Batch classes. In this case, use a task with the desired batch classes defined.
  • Master data management. In this case, use a task with the desired master data entities defined.
  • Test cases. In this case, use a task with the desired messages with test cases defined.
If you run a task, all defined messages, outbound web service actions, batch classes, and master data entities are run.
Usually, you run a task by running the related project. In this case, all tasks of the project are run, taking into account the defined task dependencies. However, you can run a single task, for example, for testing purposes.

Run test case

Design To check if a test case works properly, you can run the test case.

Run web service action

Operation

If you want to run an outbound web service action immediately, you can run it from the Web service action page.

Run web service action from action menu item

Operation You can use a custom action menu item to manually start an outbound web service action from a form. As a result, for a selected record, you can manually run the web service action from the form to which you added the action menu item.
Example: Create an action menu item and add it to the Sales orders form. On the Sales orders form, you can select a record and click the action menu item button. This starts a web service action to post a sales order. The web service action is processed, and the selected sales order is posted.

Search for applicable messages and web service actions

Operation For each processed event, the applicable messages and web service actions are searched for.
The event is only further processed for the messages and web service actions that are 'subscribed' to the table and event type.
Example:
A message has a data synchronization setup for table events. One of the records is Customer (CustTable) for which only Update is selected. If an event is logged that a:
  • New customer is created (Insert), the message is not subscribed to this event. As a result, for the message, this event is not further processed.
  • Change is made to an existing customer (Update), the message is subscribed to this event. As a result, for the message, this event is taken for further processing.
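
The subscription check in the example above can be sketched as a simple filter. Message and table names are illustrative:

```python
# Sketch of the subscription check: an event is only further processed for
# the messages that are 'subscribed' to the event's table and event type.
def applicable_messages(event, subscriptions):
    """Return the messages subscribed to the event's table and event type."""
    return [
        message
        for message, (table, event_types) in subscriptions.items()
        if table == event["table"] and event["type"] in event_types
    ]

# The hypothetical 'Sync customers' message subscribes to CustTable, Update only.
subscriptions = {"Sync customers": ("CustTable", {"Update"})}

insert_event = {"table": "CustTable", "type": "Insert"}
update_event = {"table": "CustTable", "type": "Update"}

applicable_messages(insert_event, subscriptions)   # not subscribed: empty
applicable_messages(update_event, subscriptions)   # subscribed: processed further
```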

Select fields

Design

You can add a selection of table fields to a record. You can select fields from the D365 FO table that is defined in the Record table field.

This is mainly applicable to internal documents. However, you can also use this to quickly set up fields for external file-based documents.

When the field selection is added to the record, review and complete the properties of the added fields.

Select fields - ODBC

Design

You can add a selection of table fields to a record.

You can select fields from an external table via ODBC. To connect to the external environment, the default connector of type Database is used. You can set up this default connector for the applicable project.

To find the external table name, the name that is defined in the Record table field is used.

When the field selection is added to the record, review and complete the properties of the added fields.

Select tables to be mapped

General

For a data migration from AX2012 to D365 FO, you can generate messages based on data migration setup records. Before you can do so, select the AX2012 tables whose data you want to migrate to D365 FO. As a result, the related data migration setup records are created with the selected AX2012 tables as source tables. If a table with the same name exists in D365 FO, it is automatically set as the target table.

To select the tables from the AX2012 database, the default Database connector, as defined for the data migration project, is used.

Select the content packages for comparison

Design

To compare data, select two content packages. If a desired content package does not exist, import the content package. As a result, a comparison is created.

Set Connectivity studio parameters

General

Before you start using Connectivity studio, set up the Connectivity studio parameters.

Define parameters for the:

  • Applicable environment.
  • File storage that is used for the environment.
  • Version management of Connectivity studio projects.
  • Connectivity studio history reporting.
  • Data synchronization log processing.
  • ODBC.
  • Service Bus.
  • Outbound web services queue.

Set Data quality studio integration options

Design

You can apply data quality policy rules on import of data into D365 FO with a Connectivity studio message. On the message header, define which types of rules are applied on data import with the message.

For more information, refer to Apply data quality rules on data import with Connectivity studio.

Set up applications

Analysis

Define the applications that are involved in integration or data migration projects. You can link an application to several projects.

Examples of applications are:
  • 'D365 FO' to indicate your D365 FO application.
  • 'Staging' to indicate that a staging journal is involved in the integration.
  • 'Windows folder' or 'Files' to indicate that the integration is file-based.
  • 'ERP' to indicate the integration is with another ERP application. You can use the external ERP application name as application name.
For each connector and document, you must define the applicable application. You can also define an application for a type conversion.

External ID and revision number

Applications are also used to store the external IDs and external revision numbers that are used in the external application. You can link an external ID to a record ID in D365 FO. Together with the external ID, you can also link an external revision number to a record ID. The application, as defined for the applicable connector, is used to store the link between the reference table and record ID in D365 FO and the external ID and revision number.
If you:
  • Export data, the applicable external ID or revision is searched for in the target connector application. If a record exists for the reference table and the source field (RecId) combination, the external ID or revision of this record is the output value of this option.
  • Import data, the applicable target record ID is searched for in the source connector application. If a record exists for the combination of reference table and external ID and revision, the reference record ID of this record is the output value of this option. If no record exists for the combination, a record is created. The reference record ID of this record is the output value of this option.
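
The export and import lookups described above can be sketched as a minimal cross-reference store, assuming illustrative table, ID, and revision values:

```python
# Sketch of the external ID cross-reference. On export, the external ID is
# looked up by D365 FO record ID; on import, the record ID is looked up by
# external ID and revision, and a link record is created if none exists yet.
class ExternalIdStore:
    def __init__(self):
        # key: (reference table, record id) -> (external id, revision)
        self._links = {}

    def export_lookup(self, table, rec_id):
        """Export: return the external ID/revision linked to the record."""
        return self._links.get((table, rec_id))

    def import_lookup(self, table, external_id, revision, new_rec_id):
        """Import: return the linked record ID, creating the link if needed."""
        for (tbl, rec_id), (ext_id, rev) in self._links.items():
            if tbl == table and ext_id == external_id and rev == revision:
                return rec_id
        self._links[(table, new_rec_id)] = (external_id, revision)
        return new_rec_id

store = ExternalIdStore()
rec = store.import_lookup("CustTable", "EXT-100", 1, new_rec_id=5637144576)
again = store.import_lookup("CustTable", "EXT-100", 1, new_rec_id=999)
exported = store.export_lookup("CustTable", 5637144576)
```

On the first import the link does not exist, so it is created; the second import finds the existing link and returns the same record ID.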

Set up Azure file storage connector

Design

Set up a connector of type Azure file storage. Use this type to exchange data files between your D365 FO environment (on-cloud or on-premises) and another environment, for example an on-premises environment.

With the Azure file storage type connector, you can exchange these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON.

You can exchange data files using one of these file systems:

  • Azure File Storage: You can use an Azure Storage Account to exchange data files between your D365 FO environment (on-cloud or on-premises) and another environment, for example an on-premises environment.
  • Local folders: If you use Connectivity studio on a D365 FO (on-premises) environment, you can choose to use local Windows folders to exchange data files.


Set up Azure Service Bus namespace - Queue

Design

You can use a connector of type 'Service Bus queue' to exchange information via an Azure Service Bus queue. A Service Bus queue provides First In, First Out (FIFO) message delivery to one or more competing consumers. That is, receivers typically receive and process messages in the order in which they were added to the queue, and only one message consumer receives and processes each message.

To set up an Azure Service Bus queue:
  • Create a namespace in the Azure portal
    You need the Service Bus namespace name to fill in the Service Bus namespace field of the connector.
  • Create a queue in the Azure portal
    You need the Service Bus queue name to fill in the Entity name field of the connector.
  • Create Shared access signature (SAS) policy
    You can create and use a SAS policy on these levels: Service Bus queue or Service Bus namespace. You need the:
    • SAS policy name to fill in the Policy name field of the connector.
    • Primary key of the SAS policy to fill in the Policy key field of the connector.
For more information, refer to:

Set up Azure Service Bus namespace - Topic and subscriptions

Design
You can use a connector of type 'Service Bus queue' to exchange information via an Azure Service Bus topic and subscriptions. A topic and subscriptions provide a one-to-many form of communication in a publish-and-subscribe pattern, which is useful for scaling to large numbers of recipients. Each published message is made available to each subscription registered with the topic. A publisher sends a message to a topic, and one or more subscribers receive a copy of the message, depending on the filter rules set on these subscriptions.
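
The fan-out behavior of a topic and its subscriptions can be sketched like this. The subscription names and filter rules are illustrative; real Service Bus filters are SQL-like or correlation filters, not Python predicates:

```python
# Sketch of topic/subscription fan-out: each subscription registered with
# the topic gets its own copy of a published message, unless the filter
# rule on that subscription rejects it.
def publish(message, subscriptions):
    """Deliver a copy of the message to each subscription whose filter accepts it."""
    deliveries = {}
    for name, rule in subscriptions.items():
        if rule(message):
            deliveries[name] = dict(message)   # each subscriber gets a copy
    return deliveries

subscriptions = {
    "all-orders": lambda m: True,                  # no filter: receives everything
    "eu-orders": lambda m: m["region"] == "EU",    # filter rule on the subscription
}
result = publish({"order": "SO-001", "region": "EU"}, subscriptions)
# both subscriptions receive a copy
result_us = publish({"order": "SO-002", "region": "US"}, subscriptions)
# only 'all-orders' receives a copy
```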
To set up an Azure Service Bus topic and subscriptions:
  • Create a namespace in the Azure portal
    You need the Service Bus namespace name to fill in the Service Bus namespace field of the connector.
  • Create a topic in the Azure portal
    You need the Service Bus topic name to fill in the Entity name field of the connector.
  • Create subscriptions to the topic
    You need the Service Bus subscription name to fill in the Subscription field of the connector.
  • Create Shared access signature (SAS) policy
    You can create and use a SAS policy on these levels: Service Bus topic (advised) or Service Bus namespace. You need the:
    • SAS policy name to fill in the Policy name field of the connector.
    • Primary key of the SAS policy to fill in the Policy key field of the connector.
For more information, refer to:

Set up Azure Web App

Design

Set up the Azure App Service Web App that is used to manage the web service. For more information, refer to App Service documentation.

For the binaries that are required to install the Azure Web App, contact STAEDEAN Support. They can provide you with a ZIP file that also contains the installation instructions.

Set up Blob storage connector

Design

Set up a connector of type Blob storage to exchange data files between your D365 FO environment and another environment, using Azure Blob storage. Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data.

With the Blob storage type connector, you can exchange these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.

You can use one of these authentication methods to access the Blob storage:


  • Shared access key: Grants access to a storage account.
  • Shared access signature (SAS): Grants access to a specific Blob container.
  • Client credentials: Grants access to Blob storage via a Microsoft Entra ID application with Azure Storage permissions.
  • Password: Grants access to Blob storage via a Microsoft Entra ID application with Azure Storage permissions.
For more information, refer to:


Set up company-specific field mapping

Design You can set up company-specific mapping at the field mapping level.
You can use company-specific mapping to set a field value:
  • Other than the source value for the field. To do so, define a constant, dimension, or number sequence.
  • If no source value is available for the field. To do so, define the default value.
You can view the applicable company-specific mapping for a:
  • Message: On the ActionPane, on the Design tab, in the Message mapping group, click Company-specific mapping. All company-specific mappings for the message are shown.
  • Record mapping: In the Mapping section, select a record mapping, and click Company-specific mapping. All company-specific mappings for the selected record mapping are shown.

Set up D365 FO connector

Design

Set up a connector of type D365 FO. Use this type to directly connect to a D365 FO database.

Set up data migration areas

General Set up data migration areas. Use data migration areas to:
  • Group data migration setup records, for example, by functional area.
  • Assign a level to data migration setup records.
Areas are used to generate tasks and task dependencies for the data migration project.

Set up data migration statuses

General You can set up statuses that you can apply to the data migration setup records. 
Using statuses is not mandatory. Statuses are useful to monitor the data migration setup.

Set up data synchronization - Date range - Message

Design
On export of data, the data synchronization setup defines which records are processed.
Data synchronization only applies to messages with an internal document as source document.
This topic explains how to use a date range to export only the records that are changed or added since the latest message run.
Make sure the root record of the source document has a date/time field that indicates when the record was last changed, for example a 'modifiedDateTime' field. If you run the message, all records that are found based on the source document setup are considered. For each found root record, the date/time is compared with the Latest run date/time of the message. Only records with a date/time later than the latest run date/time of the message are exported.
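
The date-range check can be sketched as a simple filter. The field and record names are illustrative:

```python
# Sketch of the date-range check: only root records changed after the
# message's latest run date/time are exported.
from datetime import datetime

def records_to_export(records, latest_run):
    """Return the records modified after the latest message run."""
    return [r for r in records if r["modifiedDateTime"] > latest_run]

latest_run = datetime(2024, 5, 1, 12, 0)
records = [
    {"id": "SO-001", "modifiedDateTime": datetime(2024, 4, 30, 9, 0)},   # older
    {"id": "SO-002", "modifiedDateTime": datetime(2024, 5, 2, 8, 30)},   # changed since
]
exported = records_to_export(records, latest_run)   # only SO-002 is exported
```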

Set up data synchronization - Date range - Web service

Design

For outbound web service actions, you can use the data synchronization setup to define which records are processed.

Web service data synchronization only applies to outbound web service actions for which data synchronization is set up. If no data synchronization is set up, the web service action must be run in another way.
This topic explains how to use a date range to export only the records that are changed or added since the latest web service action run.
Make sure the root record of the source document of the request message has a date/time field that indicates when the record was last changed, for example a 'modifiedDateTime' field. If you run the outbound web service action, all records that are found based on the source document setup are considered. For each found root record, the date/time is compared with the Latest run date/time of the outbound web service action. Only records with a date/time later than the latest run date/time of the outbound web service action are exported.

Set up data synchronization - Table events - Message

Design
On export of data, the data synchronization setup defines which records are processed.
Data synchronization only applies to messages with an internal document as source document.
This topic explains how to use table events to track data changes. You can define, for each record, which table events are logged. The table events are logged in the Data synchronization log. From here, the logged entries must be processed further to the Outbound message queue. Process the outbound message queue to run the messages to export the records.

Set up data synchronization - Table events - Outbound web service action

Design
For outbound web service actions, you can use the data synchronization setup to define which records are processed.
Web service data synchronization only applies to outbound web service actions for which data synchronization is set up. If no data synchronization is set up, the web service action must be run in another way.
This topic explains how to use table events to track data changes. You can define, for each record, which table events are logged. The table events are logged in the Data synchronization log. From here, the logged entries must be processed further to the Outbound message queue. Process the outbound message queue to run the web service actions to export the records.

Set up Database connector

Design

Set up a connector of type Database. Use this type to directly connect to an external database. This external database can be an on-premises database or a cloud database.

You can connect to an external database with an:

  • ODBC connection: Connects to an external database via ODBC, using an Azure Service Bus.
  • SQL connection: Connects to an external Azure SQL database.

Set up default connector for project

Analysis
You can define the default connectors for the project. You can define only one default connector for each connector type.
A default connector is applied in these cases:
  • If you create a message with the current project, the first entered default connector that is set to be initialized as:
    • Source is used as the source connector.
    • Target is used as the target connector.
  • If you test a document. In this case, the applicable default connector depends on the document type. If the document is an:
    • External file-based document, the default Azure file storage connector is applied.
    • External ODBC document, the default Database connector is applied.
    • Internal document, the default D365 FO connector is applied.
  • If you add records to an ODBC document, the default Database connector is applied. If the Database connection is active, you can select a table from the external database.
  • If you initialize record fields for an ODBC document, the default Database connector is applied. If the Database connection is active, the record fields are initialized based on the record table.
  • If you add record fields to an ODBC document, the default Database connector is applied. If the Database connection is active, you can select a field from the external database.

Set up default connectors for data migration project

General
You can migrate data from an AX2012 database to D365 FO. To generate data migration messages based on selected tables, default connectors are required on the project.
Define these default connectors for your data migration project:
  • Database connector to connect to the AX2012 database.
  • D365FO connector to connect to the current D365 FO environment.
Make sure the right application is defined for these connectors.

Set up default response text

Design If you run the inbound web service process asynchronously, define the default response text that is sent back to the web service when the web service action is triggered.
You run an inbound web service process asynchronously if, for the inbound web service action, the Run asynchronously field has one of these values:
  • Asynchronous
  • Batch
Example of a default response text:
{
"response": "We have received the request."
}

Set up document - D365 FO

Design Use a D365 FO document to read data from or write data to D365 FO.

Set up document - EDI

Design Use an EDI document to read data from or write data to a file if you use an EDI standard for your EDI. For example: EDIFACT or ANSI X12. The data in the file is structured in line with these standards.
Note: If you do not use an EDI standard, you can use other EDI document types for EDI. For example: JSON, XML, or Text.

Set up document - Fixed text

Design Use a Fixed text document to read data from or write data to a file with defined start positions and lengths for fields.

Set up document - Inventory journal

Design
Use an Inventory journal document to import data into the inventory journals.
Compared to the D365 FO document, the Inventory journal document focuses only on the inventory journals. It has journal-specific properties and behavior.
Based on the document type and journal name, the journal header is created automatically when the message is started and the document is opened. So, in the document:
  • No records are required for the journal header. Also, no mapping is required in the related message.
  • Only set up records for the journal line tables. All lines in the same import message run are added to the created header. For the records, use tables that are related to the defined journal name. For example, the InventJournalTrans table in combination with the InventDim table.
The journal name, as defined in the document header properties, defines which inventory journal is applicable. Examples of supported inventory journals are:
  • Movement journal
  • Inventory BOM journal
  • Counting journal
  • Transfer journal
  • Project item journal

Set up document - JSON

Design Use a JSON document to exchange data with an application that uses data-object notation. 
With a document of type JSON, you can read data from or write data to a JSON file.
For more information about JSON, refer to Introducing JSON.

Set up document - Ledger journal

Design Use a Ledger journal document to import data into the ledger journals.
Compared to the D365 FO document, the Ledger journal document focuses only on the ledger journals. It has ledger-journal-specific properties and behavior.
Based on the document type and journal name, the journal header is created automatically when the message is started and the document is opened. So, in the document:
  • No records are required for the journal header. Also, no mapping is required in the related message.
  • Only set up records for the journal line tables. All lines in the same import message run are added to the created header. For the records, use tables that are related to the defined journal name. For example, the LedgerJournalTrans table.
The journal name, as defined in the document header properties, defines which ledger journal is applicable. Examples of supported ledger journals are:
  • General journal
  • Invoice journal
On journal creation, the voucher settings of the journal name are considered. If a transaction date is available, the voucher is set.

Set up document - Microsoft Excel

Design

Use a Microsoft Excel document to read data from or write data to a Microsoft Excel file (XLSX).

Set up document - Microsoft Word

Design

In Connectivity studio, use a Microsoft Word document to write data to a Microsoft Word document (DOCX) using a Microsoft Word template (DOTX).
With a Microsoft Word document, you can, for example, add data to text, include contract text, support multi-language output, or include product attributes or specifications. In this way, you can, for example, generate invoices or contracts with the style texts as defined in the template.

To add data to the Microsoft Word document, you can add markers to the Microsoft Word template. You can add markers for document records and for document record fields.

Note: You can use a Microsoft Word document only to write. So, no read options need to be set.

Set up document - ODBC

Design

Use an ODBC document to directly read data from or write data to an external database. You can exchange data with an external database via ODBC or with an external Azure SQL database.

You can also exchange data with another database using CData or DB2.

Set up document - Other journals

Design
Connectivity studio provides you with several other journal-related document types:
  • WMS journal
    Use a WMS journal document to import data into a WMS journal. You can, for example, import or post a packing slip for a purchase order or an item arrival.
  • Report as finished journal
    Use a Report as finished journal document to import data into the Report as finished journal to report production orders as finished.
  • Production picking list
    Use a Production picking list document to create a production order picking list and post it.
  • Project journal
    Use a Project journal document to import project data into the project journals. You can, for example, import hours or expenses.

Set up document - Staging

Design
Use a Staging document to write data to or read data from the staging table.
You can use the staging table to validate data before you actually process it. For example, you can validate received data before you import it into D365 FO. You can do these validations automatically or manually, depending on the chosen handler. If the validations are:
  • Not met, the journal gets the status Rejected. The found errors or warnings are added to the journal. You can solve these data errors or warnings before further processing the data.
  • Met, the journal gets the status Approved and can be picked up for further processing.

Set up document - Text

Design
Use a Text document to read data from or write data to a file with field and line separators. For example, a CSV file.
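Reading such a file can be sketched in Python (illustrative only; the field names and the ';' separator are hypothetical):

```python
import csv
import io

# A Text document's content with ';' as field separator and a newline as
# line separator, parsed into records keyed by the header fields.
data = "SalesId;CustAccount\nSO-001;C-100\n"
reader = csv.DictReader(io.StringIO(data), delimiter=";")
records = list(reader)
```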

Set up document - Trade agreement journal

Design Use a Trade agreement journal document to import data into the trade agreement journals.
Compared to the D365 FO document, the Trade agreement journal document focuses only on the trade agreement journals. It has trade-agreement-journal-specific properties and behavior.
Based on the document type and journal name, the journal header is created automatically when the message is started, and the document is opened. So, in the document:
  • No records are required for the journal header. Also, no mapping is required in the related message.
  • Only set up records for the journal line tables. All lines in the same import message run are added to the created header. For the records, use tables that are related to the defined journal name.
The journal name, as defined in the document header properties, defines which trade agreement journal is applicable. Examples of supported trade agreement journals are:
  • Discount journals
  • Price adjustment

Set up document - XML

Design Use an XML document to exchange data in XML format without the need to code.
With a document of type XML, you can read data from or write data to an XML file.
For more information on XML, refer to XML Tutorial.

Set up EDI type

Design You can use an EDI type to indicate which EDI standard is used. For example, EDIFACT or ANSI X12. This is used for informational purposes only.

Set up endpoint

Design Set up the endpoint that manages the destination the business event is sent to. Business events in D365 FO support several endpoint types.
For more information, refer to Manage business event endpoints.

Set up Environment Comparison Studio connector

Design

Set up a connector of type ECS Azure file storage. Use this type to connect to your Azure file storage location. The connector creates the Environment comparison studio folders in the share that is defined in the Environment comparison studio parameters.  

On the Connector page, in the ECS connector details section, you can view the generated folder paths.  

The ECS Azure file storage connector creates: 

  • The environment-specific parent folder. This folder is created in the Share that is defined in the Environment comparison studio parameters. 

  • A folder for each legal entity in the environment. These folders are created in the environment-specific parent folder. 

  • A Work (ECSWorkFolder) folder for each legal entity. This folder is created in the legal entity folder. 

  • Archive, Error, Export, and Import folders. These folders are created in the Work (ECSWorkFolder) folder. 

Use the ECS Azure file storage connector to manage the paths and to place Environment comparison studio files in the applicable folders. 

 

Folder Description
Archive

The Archive folder of the connector. The folder is used to store used and imported XML files.

  • In Environment Comparison Studio, the XML files that you used for the content package creation are moved to the Archive folder. 

  • In Connectivity Studio, the resulting XML file is moved to the Archive folder when you import the file back into D365 FO. 

Error

The Error folder of the connector. Files are moved to this folder if, during import, errors occurred. 

Export Folder where export messages, with the ECS Azure file storage connector as target connector, store the generated XML files. These files are used to create ECS content packages. 
Import

Folder where the import messages, with the ECS Azure file storage connector as source connector, get XML files to import into D365 FO.

The import message, with the ECS Azure file storage connector as source connector, gets the XML files from the Import folder. 

 

Set up Environment Comparison Studio message

Design

Use messages as the carriers that transport data from a source to a target, based on the mapping as defined on the message. 

In Environment comparison studio, for a message to: 

  • Export data from D365 FO, the: 

    • Source document type is D365 FO. 

    • Source connector type is D365 FO. 

    • Target document type is XML. 

    • Target connector type is ECS Azure file storage. 

  • Import comparison result data into D365 FO, the: 

    • Source document type is XML. 

    • Source connector type is ECS Azure file storage. 

    • Target document type is D365 FO. 

    • Target connector type is D365 FO. 

Make sure the Key field is added to the source XML document. The key is a constant value that is used to compare records. The Key field is a manual addition and is not part of the table that is used in the message. 

Set up Environment Comparison Studio parameters

Design

Set the general Environment comparison studio parameters that define the storage location for the ECS Azure file storage connector generated folders. 

Set up field mapping condition

Design Sometimes, you want to skip a field mapping when running a message. You can use conditions to indicate if the field mapping must be applied. The field mapping is only applied if the conditions are met. You can set up several conditions for each field mapping.
Examples of values that you can use to define a condition:
  • ..Item1 - Item1 and smaller
  • Item1.. - Item1 and greater
  • Item1,item5..item10 - Item1, or item5 through item10
  • Item1,item2 - Item1 or item2
  • !123 - Not 123
  • “” - Empty
  • !”” - Not empty
  • curExt() - Used to apply a condition with the current company
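The documented condition syntax can be sketched roughly in Python (an illustrative matcher, not the product implementation; the product's exact comparison semantics, for example for numeric values or curExt(), may differ):

```python
def matches(value: str, condition: str) -> bool:
    """Illustrative matcher for the documented condition syntax.

    Supports: '..X' (X and smaller), 'X..' (X and greater), 'A..B' ranges,
    comma-separated alternatives, '!X' (not), and '""' (empty).
    """
    if condition.startswith("!"):
        return not matches(value, condition[1:])
    results = []
    for part in condition.split(","):
        if part == '""':                      # empty value
            results.append(value == "")
        elif part.startswith(".."):           # part and smaller
            results.append(value <= part[2:])
        elif part.endswith(".."):             # part and greater
            results.append(value >= part[:-2])
        elif ".." in part:                    # range lo..hi
            lo, hi = part.split("..")
            results.append(lo <= value <= hi)
        else:                                 # exact alternative
            results.append(value == part)
    return any(results)
```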

Set up field mapping condition using an expression

Design Sometimes, you want to skip a field mapping when running a message. You can use a condition that is defined as an expression to indicate if the field mapping must be applied. The field mapping is only applied if the expression condition is met.
You define an expression with self-defined variables. Variables must be linked to fields from the source document or the target document.
Example:
Expression: Value1>Value2
Condition value: IF(Value1>Value2, true, false)
Variables:
  • Value1 is linked to SalesPrice field of the SalesLine source record.
  • Value2 is linked to LineAmount field of the SalesLine source record.
The field mapping is only applied if the sales price is higher than the line amount.
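The example above can be sketched in Python (illustrative only; the variable bindings and values are hypothetical, and the product evaluates the expression itself, not Python code):

```python
# Hypothetical source record values bound to the expression variables.
variables = {
    "Value1": 120.0,  # linked to the SalesPrice field of the SalesLine source record
    "Value2": 95.0,   # linked to the LineAmount field of the SalesLine source record
}

def evaluate(value1: float, value2: float) -> bool:
    # Equivalent of the condition value IF(Value1>Value2, true, false).
    return True if value1 > value2 else False

# The field mapping is applied only if the condition evaluates to true.
apply_field_mapping = evaluate(variables["Value1"], variables["Value2"])
```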

Set up file action - Azure Blob storage

Design
You can use the Azure BLOB storage file action to move a file between an Azure file share and an Azure BLOB container. You can only use an Azure BLOB storage file action if you exchange files using Azure File Storage. The applicable Azure file share is the one as defined for the connector.
If the direction is:
  • Source, before the message is run, the file is moved from the defined Azure BLOB container to the Azure file share.
  • Target, after the message is run, the file is moved from the Azure file share to the defined Azure BLOB container.

Set up file action - Copy

Design You can use the Copy file action to copy a file to another location.
If the direction is:
  • Source, before the message is run, the file is copied from a defined folder to the Working folder. The original file stays in the defined folder.
  • Target, after the message is run, the created file is copied from the Working folder to the defined folder. The original file stays in the Working folder.

Set up file action - Delete

Design You can use the Delete file action to delete a file from the Working folder. Usually, the Delete file action is the last one to be done.
If the connector is set up properly with a different folder defined for Working path, Archive path, and Error path, a Delete file action is not required for direction Source.
If the direction is Target, after the message is run, the created file is deleted from the Working folder.

Set up file action - Email - D365FO email

Design You can use the Email file action to exchange files with email. This topic explains how to set up a file action to send files using the email setup as defined for D365FO. For more information, refer to Configure and send email.
For D365 FO email, the direction is 'Target'. After the message is run, an email is created using the defined email account, the created file in the Working folder is attached to the email, and the email is sent. The original file stays in the Working folder. If you want to zip the file before sending it by email, do a Zip file action before the Email file action. In this case, the zipped file is attached to the email.

Set up file action - Email - Exchange server

Design You can use the Email file action to exchange files with email. This topic explains how to set up a file action to send or receive files using Microsoft Exchange server.
If the direction is:
  • Source, before the message is run, the defined email account inbox is searched for applicable emails. If found, the attached files of these emails are saved to the Working folder.
  • Target, after the message is run, an email is created using the defined email account, the created file in the Working folder is attached to the email, and the email is sent. The original file stays in the Working folder. If you want to zip the file before sending it by email, do a Zip file action before the Email file action. In this case, the zipped file is attached to the email.

Set up file action - Email - IMAP

Design You can use the Email file action to exchange files with email. This topic explains how to set up a file action to receive files using an IMAP server.
For IMAP, the direction is 'Source'. Before the message is run, the defined email account inbox is searched for applicable emails. If found, the attached files of these emails are saved to the Working folder.

Set up file action - Email - SMTP server

Design You can use the Email file action to exchange files with email. This topic explains how to set up a file action to send files using an SMTP server.
For SMTP, the direction is 'Target'. After the message is run, an email is created using the defined email account, the created file in the Working folder is attached to the email, and the email is sent. The original file stays in the Working folder. If you want to zip the file before sending it by email, do a Zip file action before the Email file action. In this case, the zipped file is attached to the email.

Set up file action - FTP

Design FTP is no longer supported for security reasons. Instead, use file actions of type SFTP.

Set up file action - Move

Design You can use the Move file action to move a file to another location.
If the direction is:
  • Source, before the message is run, the file is moved from a defined folder to the Working folder.
  • Target, after the message is run, the created file is moved from the Working folder to the defined folder.

Set up file action - SFTP

Design
You can use the SFTP file action to exchange files with an SFTP server.
If the direction is:
  • Source, before the message is run, the file is downloaded from the SFTP server to the Working folder. By default, the original file is deleted from the SFTP server. You can disable the delete action of the original file.
  • Target, after the message is run, the file is uploaded from the Working folder to the SFTP server. The original file stays in the Working folder. If you want to zip the file before uploading it to the SFTP server, do a Zip file action before the SFTP file action. In this case, the zipped file is uploaded to the SFTP server.

Set up file action - Zip

Design You can use the Zip file action to zip or unzip a file in the Working folder. You can use this, for example, in combination with the Email file action to send or receive zipped files using email.
If the direction is:
  • Source, before the message is run, the file is unzipped to the Working folder. The original ZIP file stays in the Working folder.
  • Target, after the message is run, the created file is zipped to the Working folder. The original file is deleted from the Working folder. Next file actions in the file action sequence are done with the ZIP file.
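The Target-direction behavior can be sketched in Python (illustrative only; the folder and file names are hypothetical, and the product performs this internally):

```python
import os
import zipfile

def zip_target(working_path: str, file_name: str) -> str:
    """Sketch of the Target direction of a Zip file action: zip the created
    file in the Working folder, delete the original, and return the ZIP path
    so that the next file actions in the sequence use the ZIP file."""
    source = os.path.join(working_path, file_name)
    zip_path = source + ".zip"
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.write(source, arcname=file_name)
    os.remove(source)  # the original file is deleted from the Working folder
    return zip_path
```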

Set up inbound web service website

Design Set up the website that is required to run the IIS application.

Set up local Windows folders for Azure file storage connector

Design If you use Connectivity studio on a D365 FO (on-premises) environment, you can choose to use local Windows folders to exchange data files.
Create the folders that are required to exchange files. You can create folders that relate to the paths in the Properties section, the Read section, and the File actions section of the connector:
  • Working
  • Archive
  • Error
  • Split
  • Copy
  • Move

Set up local Windows folders for general files

General

If you use Connectivity studio on a D365 FO (on-premises) environment, you can choose to use local Windows folders to exchange data files.

In the local Windows folders, create the required folders. Create folders that relate to the paths in these sections of the Connectivity studio parameters:
  • Version management section: Shared path
  • History section: History report path

Set up mapping condition using an expression

Design Sometimes, you want to skip a record mapping when running a message. You can use a condition that is defined as an expression to indicate if the record mapping must be applied. The record mapping is only applied if the expression condition is met.
You define an expression with self-defined variables. Variables must be linked to fields from the source document or the target document.
Example:
Expression: Price>Amount
Condition value: IF(Price>Amount, true, false)
Variables:
  • Price is linked to SalesPrice field of the SalesLine source record.
  • Amount is linked to LineAmount field of the SalesLine source record.
The record mapping is only applied if the sales price is higher than the line amount.
For more information on expressions, refer to: Expression.

Set up mapping conditions

Design Sometimes, you want to skip a record mapping when running a message. You can use conditions to indicate if the record mapping must be applied. The record mapping is only applied if the conditions are met. You can set up several conditions for each record mapping.
Examples of values that you can use to define a condition:
  • ..Item1 - Item1 and smaller
  • Item1.. - Item1 and greater
  • Item1,item5..item10 - Item1, or item5 through item10
  • Item1,item2 - Item1 or item2
  • !123 - Not 123
  • “” - Empty
  • !”” - Not empty
  • curExt() - Used to apply a condition with the current company

Set up message

Design

Use messages as the carriers that transport data from a source to a target, based on the mapping as defined on the message.
This topic explains how to set up a message header. On a message header you define:
  • The source information: the connector and document that is used for the data source.
  • The target information: the connector and document that is used for the data target.
  • General settings that define the behavior of the message.

Set up Microsoft Graph API

Design

To connect to SharePoint, the Microsoft Graph API is used. Register an app in Microsoft Entra ID and configure the Microsoft Graph permissions.

The way you set up the app in Microsoft Entra ID depends on how you want to access SharePoint. You can access SharePoint with:
For more information, refer to:

Set up qualifiers

Design For each EDI type, you can set up qualifiers for an EDI segment. A common segment for which qualifiers are used is the address (NAD) segment. You can have different addresses in the same segment, for example, delivery address and invoice address. Use a qualifier to distinguish these addresses in the EDI file.
Example:
Set up the 3035 (party function code) qualifier for the NAD (name and address) segment with these values:
  • DP; Delivery party; Party to which goods must be delivered.
  • IV; Invoicee; Party to whom an invoice is issued.
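Picking the right address by qualifier can be sketched in Python (illustrative only; the parsed segments and party names are hypothetical):

```python
# Hypothetical parsed NAD segments from an EDI file: (3035 qualifier, party name).
nad_segments = [
    ("DP", "Contoso Warehouse"),    # DP = delivery party
    ("IV", "Contoso Head Office"),  # IV = invoicee
]

def party_for(qualifier: str, segments):
    """Return the party name for a given 3035 qualifier, or None if absent."""
    for code, name in segments:
        if code == qualifier:
            return name
    return None
```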

Set up record mapping

Design
On the message record mapping, define for each target document record the mapped source document record.

Set up Service Bus queue connector

Design Set up a connector of type 'Service Bus queue'. Use this connector type to exchange information via an Azure Service bus queue or topic.
You can use the 'Service Bus queue' connector to:
  • Export data:
    Use a Connectivity studio export message to send messages to the applicable Service Bus queue or Service Bus topic. The export message must have the 'Service Bus queue' connector as target connector.
    On export, each message that is sent to a queue or topic is logged in the 'Data sent to queue' history table.
  • Import data:
    1. Run 'Receive queue data' to receive messages from a Service Bus queue or a Service Bus topic subscription. The received messages are added to the 'Received data from queue' table in Connectivity studio.
      These receive modes are supported: 'Receive and delete' and 'Peek lock'.
    2. Run import messages to read data from the 'Received data from queue' table and import this data into D365 FO. The import message must have the 'Service Bus queue' connector as source connector.

The related document defines which data is sent to or received from a queue or topic and in which format and structure. So, the document does not result in a file.

With the 'Service Bus queue' connector, you can use these external file-based documents: EDI, Fixed text, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.

Set up Service Bus search definition

Design
You can use Service Bus search definitions to automatically assign Connectivity studio messages to received data records from a Service Bus.

Set up SharePoint

Design In SharePoint, create the folders that are required to exchange files. You can create folders that relate to the paths in the Properties section, the Read section, and the Write section of a connector of type 'SharePoint':
  • Working
  • Archive
  • Error
  • Split/Temporary
  • Temporary

Set up SharePoint connector

Design

Set up a connector of type SharePoint to exchange data files between your D365 FO environment and another environment, using SharePoint. SharePoint is a solution to share and manage content, knowledge, and applications to empower teamwork, quickly find information, and seamlessly collaborate across the organization.

With the SharePoint type connector, you can exchange these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.

Set up staging display options

Design For a document of type Staging, for each document record, you can set up the display options for the related staging journal.
Use the display options to define which fields of the record are shown and editable on the Data tab of the staging journal.

Set up Staging journal connector

Design

Set up a connector of type Staging journal. Use this type to validate and approve data before it is imported into D365 FO.

A staging journal scenario consists of:

  • A message to import the data from the source into the staging journal. This message has a target document of type Staging and a target connector of type Staging journal.
  • A staging journal which is an additional table in which the data is validated and approved before the data is imported into D365 FO.
  • A message to get the approved data from the staging journal and import it into D365 FO. This message has a source document of type Staging and a source connector of type Staging journal.

Set up staging validations

Design

For each document record, you can define data validations to be done when a data record is inserted in the staging journal.

Found validation errors, warnings, or dependencies are stored as validation errors for the relevant staging journal.
For a Staging document, these standard validation classes are available:
Validation class Description
BisValidationFieldisMantadory This class checks if the field is filled. The arguments are: FieldValue and Type. The validation is met if the field has a valid value. For example, for date fields, the value 0 is not valid.
BisValidationReferenceExists This class checks if a record exists in the defined table. The validation is met if at least one record exists in the table. The arguments are KeyFieldName or KeyFieldValue (only define one of these arguments) and TableName. For example, you can check if a customer exists in the CustTable. If the customer does not exist, a validation error is reported.
BisValidationMdmDifference

This class checks if differences exist between the data that is sent from the source to the MDM staging journal and the current data in the target. This validation defines what happens with MDM staging journal lines with data differences.

You can only use this class if you use:

  • The Master data management staging journal.
  • Data comparison on the MDM staging journal.

For more information, refer to Monitor MDM staging journal.
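The intent of the two generic validation classes can be sketched in Python (illustrative stand-ins, not the actual validation classes; the reference table is mocked as a set of keys):

```python
# Hypothetical in-memory stand-in for the CustTable used by the reference check.
cust_table = {"C-100", "C-200"}

def field_is_mandatory(field_value) -> bool:
    """Sketch of the mandatory-field check: met if the field has a valid value.
    Per the documentation, the value 0 is not valid (for example for dates)."""
    return field_value not in (None, "", 0)

def reference_exists(key_field_value, table) -> bool:
    """Sketch of the reference-exists check: met if at least one record with
    the key exists in the defined table."""
    return key_field_value in table
```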

Set up template company in Connectivity studio parameters

General

You can use a template company as source company. If you run a message, the data is exported from the template company and imported into the target company.
In a template company, you can, for example, define generally applicable master data. You can use this to set up the master data for a new company.
To be able to use a template company, in the Connectivity studio parameters, define the template company for each target company. You can define several template companies. For example, with country-specific or industry-specific data.

On connectors of type D365FO, use the Template company field to indicate if a template company is used.
Dimension numbering
If you use several companies, for each company, you can use different financial dimensions. In the Connectivity studio parameters, for each template company, you can assign company-specific numbers to financial dimensions. To make a message company-independent, you can use these numbers to identify the applicable dimensions. So, instead of the dimension name, in the message, you use a number as set up in the Connectivity studio parameters. If the message is run, based on the number, the applicable dimension name is found for the company. Based on the RecId and this dimension name, the dimension value is read or written.
Example: For company A, number 1 refers to dimension Department. For company B, number 1 refers to dimension Customer. For the message, dimension numbering is active. On import with this message in company:
  • A, the field mapping sets the value of the Department dimension.
  • B, the field mapping sets the value of the Customer dimension.
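The lookup in the example above can be sketched in Python (illustrative only; the company names and dimension assignments are hypothetical):

```python
# Hypothetical per-company dimension numbering from the Connectivity studio
# parameters: the same number maps to a different dimension per company.
dimension_numbers = {
    "CompanyA": {1: "Department"},
    "CompanyB": {1: "Customer"},
}

def dimension_name(company: str, number: int) -> str:
    """Resolve a company-independent dimension number to the applicable
    dimension name for that company, as done when the message is run."""
    return dimension_numbers[company][number]
```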

Set up test tasks for test project

Design If you have created a test project, set up the tasks for this project. Make sure you only set up tasks that are used for testing (Run test case is Yes).
To each test task, you can add the applicable messages from the integration or data migration project. Make sure you only add messages with test cases. So, you use the actual integration or data migration messages for testing.

Set up transformation

Design

If you need a transformation and it does not already exist, you must set it up.
You can use a transformation to change a source value into another value.
You can use these types of transformations:
  • Value: A standard transformation to directly transform a source value into another value. You have these options:
    • Value: The source value is always transformed to another value.
    • Condition: The source value is only transformed to another value if a condition is met. The condition is based on a table field that is defined for the source document record. So, the condition record field must be available in the source document, but it is not required in the message field mapping. Usually, the condition record field is not the field for which you define the transformation in the field mapping.
  • Range: A transformation that is based on a range. All values that meet the range, are transformed to the defined Value to. Example: Range='EN-*' and Value to= 'EN-US'. All values that start with 'EN-' are transformed to 'EN-US'.
Usually, you use either value transformations or range transformations. However, if you use a combination, first the value transformations are checked for an applicable transformation and then the range transformations. The first found applicable transformation is applied.
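The lookup order described above can be sketched in Python (illustrative only; the transformation setup is hypothetical, and the range is approximated here with a 'EN-*' wildcard match):

```python
import fnmatch

# Hypothetical transformation setup: value transformations are checked first,
# then range transformations; the first applicable transformation is applied.
value_transforms = {"GB": "EN-GB"}
range_transforms = [("EN-*", "EN-US")]  # Range='EN-*', Value to='EN-US'

def transform(source: str) -> str:
    if source in value_transforms:
        return value_transforms[source]
    for pattern, value_to in range_transforms:
        if fnmatch.fnmatchcase(source, pattern):
            return value_to
    return source  # no applicable transformation: keep the source value
```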

Set up type conversion

Design

If you need a type conversion and it does not already exist, you must set it up.

You can use a type conversion to convert the data to match the format as required in the target. With a type conversion, you can convert values from any type to string or from string to any type. Usually, the string value is the external value. Note: Type conversions from any type to any type are not supported. For example, a conversion of type integer to type date is not possible.

You can use these conversion types:

 

Conversion type

Description

Text

Define the format in which a text is imported or exported. You can, for example, replace or remove characters, or use one element of the text.

Amount

Define the format in which an amount is imported or exported. You can, for example, define separators and unit conversion.

Date

Define the format in which a date is imported or exported. You can, for example, define the sequence and separator.

Enum

Define the format in which an enum value is imported or exported. You can, for example, define that the enum value is imported or exported as text.

Time

Define the format in which times are imported or exported. You can define the format and the separators to be used.

UtcDateTime

Define in which format a date and time field value is imported or exported. This type combines the Date and Time types.

Date/time format

Define in a flexible way the format in which a date and time field value is imported or exported. You can also include a time zone. Note: The format is case sensitive. For example, the lowercase 'm' is the identifier for minute, and the uppercase 'M' is the identifier for month. Example: dd-MM-yyyy. For more information on how to set up the date/time format, refer to Custom date and time format settings.
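The case-sensitive token behavior can be sketched in Python (illustrative only; the token set shown is a small subset, and the product uses its own format engine):

```python
from datetime import datetime

def format_datetime(value: datetime, fmt: str) -> str:
    """Sketch of case-sensitive date/time format tokens: uppercase 'MM' is
    the month, lowercase 'mm' is the minute, as in the documentation."""
    tokens = {
        "yyyy": f"{value.year:04d}",
        "MM": f"{value.month:02d}",   # uppercase M: month
        "dd": f"{value.day:02d}",
        "HH": f"{value.hour:02d}",
        "mm": f"{value.minute:02d}",  # lowercase m: minute
        "ss": f"{value.second:02d}",
    }
    out = fmt
    for token, text in tokens.items():
        out = out.replace(token, text)  # str.replace is case sensitive
    return out
```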

Set up Upload and download connector

Design

Set up a connector of type Upload and download. Use this type to:

  • Upload files: On the message, select the connector as source connector. On running the message, you can manually select and upload a file from a folder and import its contents.
  • Download files: On the message, select the connector as target connector. On running the message, a file is created and downloaded to your local downloads folder.

With the Upload and download type connector, you can upload or download these external file-based documents: EDI, Fixed text, Microsoft Word, Microsoft Excel, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.

Set up web service action - Inbound

Design You can use the inbound web service process to receive a request from and send a response to an external application, via an inbound web service.
For each inbound web service action, you can define these messages:
  • Request message: The request message provides D365 FO with data from the external application.
  • Response message: The response message sends the response from D365 FO to the inbound web service.
  • Error message: The error message sends the error information from D365 FO to the inbound web service.
An inbound web service action can only be triggered by an inbound web service. When triggered, the inbound web service action automatically runs the defined messages.

Set up web service action - Outbound

Design

You can use an outbound web service action to request data from an external application and to process the response in D365 FO, via an external web service.

For each outbound web service action, you can define these messages:
Message Description
Request message The request message provides the web service with data from D365 FO.
Response message The response message processes the response from the web service in D365 FO.
Error message The error message processes the error from the web service in D365 FO.

 

Note: Only D365 FO can trigger an outbound web service action. When triggered, the outbound web service action automatically runs the defined messages.

Attributes

You can use attributes to add extra information to the web service URL or HTTP request.
The attribute type indicates how the attribute value is defined. You can use these types:
Attribute type Description
Value The attribute is a fixed value. Enter the fixed value in the Value or Custom field.
Document field The attribute value is derived from a field of the source document of the request message. Fill in the Document field field.
Record field

The attribute value is derived from a field of a selected record. Usually, this type is used to get records. Only use this type if you start the web service with a menu item from a specific page. Fill in these fields: Record table and Record field.

Example: You start the web service from the Sales orders page. You can use the attribute to get all sales orders for the customer of the selected sales order. In this case, you fill in the CustTable and ID.

Custom You can enter a static method that defines the range. The static method is applied to the source document of the request message.
Secret You can enter a secret reference to be used as attribute. The secret reference refers to a centrally stored secret which makes updating secrets easier. So, the secret value is not visible on the Web service action page. You only see the secret reference. Fill in the Secret reference field.

 

The attribute style defines how the attribute is applied to the request. You can use these attribute styles:

Attribute style Description
Header Sends a custom header with an HTTP request. The attribute is added to the header of the HTTP request.
Query

Most common attribute style. It applies to the whole request. The attribute is added to the URL after the question mark (?) that follows the resource name.

Example: https://myserver.com/resource?attr1=Your Value&attr2=Your Value

Template

Parameterizes the resource path, adding a placeholder for a variable value.

Example: https://myserver.com/resource/{attr3}

Matrix

Applies to a specific resource path element. The attribute is added to the URL, between the resource or template attribute and the query attributes. The attribute is separated from the resource or template attribute with a semicolon (;).

Example: https://myserver.com/resource/{attr3};attr4=Your Value?attr1=Your Value

Plain Excludes the attribute from the HTTP request. For example, for testing purposes.
Body key pair Usually, for an outbound web service, the body contains the content. In some cases, for example for Dataverse, the body contains more data than only the content. The data is split into a list of so-called key pairs. In this case, the content is stored in a key pair instead of in the body. For each key pair to be added to the request body, add an attribute to the outbound web service action.
To use key pairs in your body, use these settings:
  • Content type: application/x-www-form-urlencoded
  • Attribute type: Value
  • Attribute style: Body key pair
  • Attribute value: [Body]; only for the attribute that contains the request content.
Note: You can only apply this setup if the target document of the request message is of type JSON.
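As a sketch, the key-pair body described above can be composed as follows. The attribute names and request content are hypothetical; in practice, the web service action composes the body based on the defined attributes.

```python
from urllib.parse import urlencode

# Hypothetical key-pair attributes of an outbound web service action.
# The attribute with value "[Body]" holds the request content.
attributes = {
    "grant_type": "client_credentials",
    "payload": "[Body]",
}
request_content = '{"name": "Customer X"}'  # output of the request message (JSON)

# Substitute the content placeholder, then URL-encode the key pairs.
body_pairs = {
    key: (request_content if value == "[Body]" else value)
    for key, value in attributes.items()
}
# Sent with Content-Type: application/x-www-form-urlencoded
body = urlencode(body_pairs)
```

Because the body is URL-encoded, the JSON content arrives as the value of one key pair rather than as the raw request body.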

Set up web service application for project

Analysis
For an outbound web service, a connection must be established with an external web service.
To connect to the external web service, for the applicable project, set up the web service application.
The web service application defines the base URL for the related web service actions.
If required, you can also set up the user authentication to get access to the external web service.

Set up Web service connector

Design

Set up a connector of type Web service. Use this type to exchange data via a web service using a stream.

If you:

  • Import data, the web service adds its data to a stream. From this stream, the data is imported into D365 FO.
  • Export data, the data is exported from D365 FO and added to a stream. The web service takes the data from this stream and processes it.

The related document defines which data is added to or taken from a stream and in which format and structure. So, the document does not result in a file.

With the Web service type connector, you can use these external file-based documents: EDI, Fixed text, Text, XML, JSON. You can only use this connector in combination with a document for which the version 3 (V3) handler class is selected.

 

Set up web service user

Design
Set up the external users that are allowed to access the inbound web service to start an inbound web service action.
For each external web service user, define:
  • The allowed inbound web service actions.
  • The companies in which the web service action is allowed to run.
If the inbound web service application (IIS application or Azure Logic Apps) receives an HTTP request from an external application, it calls a D365 FO method to check:
  • If the external user, as defined in the HTTP request, is defined as a web service user.
  • If the web service user is allowed to start the web service action.
  • If the web service user is allowed to run the web service action in the company, as defined in the HTTP request.
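A minimal sketch of these checks, assuming hypothetical user and action names; the actual check is done by a D365 FO method against the web service user setup.

```python
# Hypothetical web service user setup: allowed actions and,
# per action, the companies in which the action may run.
web_service_users = {
    "alice@example.com": {
        "actions": {"CreateSalesOrder"},
        "companies": {"CreateSalesOrder": {"USMF", "DEMF"}},
    }
}

def is_request_allowed(user: str, action: str, company: str) -> bool:
    setup = web_service_users.get(user)
    if setup is None:                    # not defined as a web service user
        return False
    if action not in setup["actions"]:   # action not allowed for this user
        return False
    # The action must be allowed to run in the requested company.
    return company in setup["companies"].get(action, set())
```

Only if all three checks pass is the inbound web service action started.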

Show Infolog for message run

Operation For each message run with errors, you can open the Infolog and read the related error messages.

Show where a secret reference is used

Design

You can show the records where a secret reference is used. This can be helpful, for example, if you want to change a secret and you want to see which records are involved.

Solve connection issue for Azure file storage connector

Design

For an Azure file storage connector, you connect to an Azure file share. To access the Azure file share, you can mount it.

For most connectors, when a message is run, a connection is made only for the run. After the run, the connection is closed. However, for an Azure file storage connection with a mounted Azure file share, the connection stays, independent of message runs.
When a related message is run or the connection is tested, a check is done, and an error can occur if something has changed. For example, the connector username or password is changed. In this case, the mounted connection to the Azure file share is no longer valid. To reset the mounting, first remove the connection (see steps). The next time a related message is run or the connection is tested, the mounting is restored, for example, with the proper username or password.
Usually, messages are run in batch. If you run a message in batch, you can use a batch group to direct a batch task to another server. If a message runs on another server, the connection also stays on that server. In case of an error or change, this connection is no longer valid. So, the connection on the other servers must also be removed.

Solve errors - File actions

Operation

Solve the file action errors that occurred during the message run.

Solve errors - Message run

Operation

Solve the errors that have occurred in the message run.

Solve validation errors and warnings

Operation If you use staging in your inbound integration process, the records that you receive are stored in the staging order journal.
These records are validated according to the applicable journal validation setup. If the applicable validation rules are not met, an error or warning is given. Before the records can be processed further, review the errors and warnings and take appropriate actions.
If journal validation errors or warnings are given, you have these options:
  • Solve the errors or warnings.
  • Cancel the records with errors.
  • Accept the records with warnings.
  • Cancel the journal.
You cannot accept staging journals or lines with errors. If you do so and approve the staging journal, the journal or line with errors is again set to Rejected.

Source connection is made

Operation Based on the source connector, a connection is made with the data source of the message.

Source data is retrieved

Operation Based on the setup of the source document, the source data is queried. The found data is input for the mapping as set up for the message.

Split logged events over pages

Operation
To improve the performance when processing a lot of logged events, you can use paging. For paging, the logged events are split over several pages that are run in parallel batch tasks.
You define the paging setup in these fields of the Process data synchronization log dialog:
  • Number of pages: You might be able to run only a limited number of batch jobs in parallel, or you might want to use a limited number of batch jobs. If so, define the maximum allowed number of pages.
  • Page size: You can split the to-be-processed events over several batch tasks. Define the number of events to be processed by one batch task.
On the pages, the events are grouped by RecId, table, and event type. So, all events of a specific event type that are logged for a unique record, are grouped on the same page. For example, if five change events are logged for customer X, these five events are grouped on one page.
Example:
The Number of pages is 4 and the Page size is 1000.
- If 1,500 logged events are processed, these are split over 2 pages.
- If 10,000 logged events are processed, these are initially split over 10 pages. However, only 4 pages are allowed. Therefore, a recalculation is done, splitting the 10,000 events over the 4 allowed pages. As a result, 4 pages with 2,500 events each are processed.
Notes:
  • If you do not use paging, all to-be-processed events are added to one page and processed by one batch task.
  • When selecting logged events to be processed, the synchronization parameters are considered:
    • Last run: Only events with a created date and time after the latest run date and time are selected.
    • Synchronization delay (minutes): Only events that are logged before the current processing date/time minus the defined number of minutes are selected.
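The paging calculation described above can be sketched as follows. This is a sketch of the described behavior, not the actual implementation.

```python
import math

def split_into_pages(event_count: int, page_size: int, max_pages: int) -> tuple[int, int]:
    """Return (pages, events_per_page) for the paging setup."""
    pages = math.ceil(event_count / page_size)
    if pages > max_pages:
        # Recalculation: spread all events over the maximum allowed pages.
        pages = max_pages
        page_size = math.ceil(event_count / pages)
    return pages, page_size
```

For the example above, 1,500 events yield 2 pages of up to 1,000 events, and 10,000 events yield 4 pages of 2,500 events each.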

Take over project checkout

Analysis If a project is checked out in another environment, you can take over the checkout in the current environment. Only do so if, for some reason, the project cannot be checked in in the other environment.
As a result, the project:
  • Is blocked for changes in the other environment.
  • Has the changes that are made in the other environment undone.
  • Is checked out in the current environment.
  • Gets a provisional version number in the current environment that is two higher than the latest version. Example: The latest version of a project is '4'. When checked out in the other environment, it gets the provisional version number '5' in the other environment. If you take over the project in the current environment, it gets the provisional version number '6' in the current environment.
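The version numbering can be sketched with a hypothetical helper. The reasoning is that the other environment already claimed the next provisional number at checkout, so the takeover claims the number after that.

```python
def takeover_version(latest_version: int) -> int:
    # Checkout in the other environment claimed latest_version + 1.
    # Takeover in the current environment claims the next number again.
    return latest_version + 2
```

For the example above, a latest version of '4' yields the provisional version '6' after takeover.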

Target connection is made

Operation Based on the target connector, a connection is made with the target of the message.

Test data synchronization log processing

Design

For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.

When the logged events are processed, for each applicable message or web service action, a record is added to the Outbound queue.

If logged events in the data synchronization log are not processed to the outbound queue, you can:

  • Check if processing the data synchronization log is set up correctly.
  • Test the data synchronization log processing, which is explained in this topic.

When tested, a logged event is processed to the outbound queue. If processing a logged event goes:

  • Fine, a record is added to the outbound queue. But, different from normal processing, the logged event stays in the data synchronization log.
  • Wrong, no record is added to the outbound queue.

When the test is finished, you can view the processing steps and result in the message details. For each message or web service action that is subscribed to events of the same table, you can view the processing steps. In case of errors, verify the processing steps to see where the issue occurred.

This picture is an example of message details, with an explanation of the different parts:

Explanation:

  1. The type of the logged event.
  2. The message or web service action that is subscribed to events of the table.
  3. The checks that are done to see if the logged event is valid for the message:
    • In case of an update: Is the field for which a value is changed defined in the source document of the message?
    • Is the company for which the event is logged defined as source company for the source connector of the message?
  4. In the example, on the message, for the data synchronization setup, the Redirect event field is set to Yes. This means, on processing the data synchronization log, if the event is logged for a child record of the source document, it is added to outbound message queue for the root record of the source document. Example: The source document has a root record 'Sales order' and a child record 'Sales line'. A table event is logged for a sales line. The sales order record is added to the outbound message queue instead of the logged sales line record. As a result, the sales order and all its sales lines are exported by the message.
  5. The message that is shown if no record is added to the outbound queue. If no record is added to the outbound queue, a Query info section is also shown. This section shows the query that is applied.
  6. A summary of the result of the test.

Test document

Design

If you run into an issue with a message, you can separately test the source and target documents.

The document type defines what is tested:
  • External document - File based: Reads the data from a defined file and tests the document query.
  • External document - ODBC: Reads the data from an external database and tests the document query.
  • Internal document: Tests the document query.
As a result:
  • The records that are read from the file or from the external database, are shown on the Imported records page.
  • When you test a document, based on the document setup, a query is created that reads the data. The result of the query test is shown in the message bar. If the 'No data found' message is shown, the document is not set up properly. In the message bar, you can click 'Message details' to see more detailed information.
Note: Before you test an external document, make sure the document process type is set to 'Query'. After testing the document, if the original process type was 'Direct', do not forget to change the process type back to 'Direct'.

Test inbound Azure Logic App

Design Before you deploy the inbound Azure Logic App, test it.

Test inbound web service action

Design

You can test an inbound web service action without receiving an HTTP request from the external application. So, you only test the inbound web service action setup and not the full process with connection to the external application.

When testing, the inbound web service action does run the request message and response message. So, data can be impacted. Therefore, you are advised to only test an inbound web service action in a Development or Test environment.

For the result of an inbound web service action test, view the Result section on the test page. You can also view the message history of the related request message and response message.

Test outbound Azure Logic App

Design Before deploying the outbound Azure Logic App, test it.

Test outbound web service action

Design

You can test an outbound web service action without sending a request to the external web service. So, you only test the outbound web service action setup and not the full process with connection to the external web service.

When testing, the outbound web service action does run the request message and response message. So, data can be impacted. Therefore, you are advised to only test an outbound web service action in a Development or Test environment.

For the result of an outbound web service action test, view the message history of the related request message and response message.

Troubleshoot data synchronization log issues

Operation

You can use table events to log data changes. You can define, for each table, which table events are logged. The table events are logged in the Data synchronization log.

If table events are not logged in the Data synchronization log, check the data synchronization setup for your message or outbound web service action.
You can also check if processing the data synchronization log is set up correctly.

Troubleshoot staging issues

Operation

In Connectivity studio, you can use the staging concept to validate data in an intermediate area before it is further processed. Usually, the issues as shown in the staging journal are data related.

If it appears that issues are not data related, extend your issue investigation to the messages that import data to or export data from the staging journal.

Troubleshoot trigger issues

Operation

An integration can be triggered in several ways. If a trigger fails, no errors are shown in the Connectivity studio Integration operations history.

Batch job

You can run a project, message, or web service in batch, usually in recurring mode.
If the message or web service does not run at all, the batch server might be down.

Custom code

You can run a message from custom code.
If you use custom code to run a message, and the message does not run, the custom code is probably not right. You need a developer to check and repair the custom code.

Update secret reference name

Design

You can change a secret reference name. This automatically updates the secret reference in all places where it is used.

You can update a secret reference name, for example, after you upgraded from local secret storage to central secret storage. In this case, you can change the automatically created secret reference names.

Upgrade secrets to the secret reference tables

Design

For each project, you can migrate from 'locally' stored secrets to centrally stored secrets. To do so, you can automatically collect the locally stored secrets and store these in the centrally stored secret references.

During upgrade:

  • All locally stored secrets of the project are collected. Note: Also, the Azure storage password from the Connectivity studio parameters is collected and upgraded.
  • For each unique secret, a secret reference is created with an automatically generated name. So, no duplicate secret references are created. Note: A secret reference is unique for the secret and environment type combination.
  • For each record with a 'local' secret, the secret is removed, and the newly created secret reference is linked to the record.

Note: Usually, you only use this upgrade function once per project during migration from locally to centrally stored secrets.
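The deduplication step above can be sketched as follows, with hypothetical records, secrets, and generated reference names.

```python
# Hypothetical records with locally stored secrets: (record, secret, environment type).
local_secrets = [
    ("Connector A", "s3cret-1", "Test"),
    ("Connector B", "s3cret-1", "Test"),   # same secret and environment: reuse the reference
    ("Connector C", "s3cret-2", "Test"),
]

references: dict[tuple[str, str], str] = {}  # (secret, env) -> reference name
links: dict[str, str] = {}                   # record -> linked secret reference

for record, secret, env in local_secrets:
    key = (secret, env)
    if key not in references:                # no duplicate references are created
        references[key] = f"SecretRef{len(references) + 1:03d}"  # generated name
    links[record] = references[key]          # link the record; the local secret is removed
```

Connectors A and B end up linked to the same secret reference because their secret and environment type combination is identical.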

Use file explorer

Operation

You can use the File explorer to quickly access and view the share as defined for the applicable Azure file storage connector. The File explorer offers limited functionality. You can only view, copy, move, or delete files.

Verify test results

Design When a test case is run, the results are stored as test case run. For each test case run, you can review the test results, and if applicable do the manual tests.
The status is shown in these ways:
  • Test step status in the Result section:
    The test step status can be automatically set to:
    • Passed: If the test step result matches with the expected status, the test step Status is set to Passed.
    • Failed: If the test step result does not match with the expected status, the test step Status is set to Failed.
    • New: The mapping cannot be done because there is no value to be set.
  • Test case run status in the Test case run section:
    The test case run status field can automatically be set to:
    • Passed: If for all test steps, for all tested records, the status is Passed, the test case run status is set to Passed.
    • Failed: If for at least one of the steps, for at least one of the tested records, the status is Failed, the test case run status is set to Failed.
    • No data: If the test case cannot be done because no data was available to be processed, the test case run status is set to No data.
A test case can fail, for example, caused by errors or by a test step that must be tested manually. You can review a failed test case and, if applicable, manually test the manual test steps.
After reviewing and manual testing, you can manually change the test case run status in line with your findings.

View error report

Operation

You can view a report in Microsoft Excel format that contains the errors that occurred during a message run. You can, for example, use this to inform the sender of data on the errors.

You can create an error report:
  • Automatically: To do so, on the message header, select 'Yes' in the 'Create history report' field. The report is created when the message run is finished.
  • Manually: To do so, on the Connectivity studio Integration operations workspace, on the History errors tab or History tab, select a record, and click Create history report.
The report is stored in the folder as defined in the Business integration parameters, in the History report path field. This is a folder in the file share as defined in the Windows share field.
To view a history report, open the Azure file share or Windows folder, as defined in the Windows share field on the Business integration parameters.
You can find the report based on the message name. The name of the history report is composed in this way: Message [message name]_[number sequence].
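The report name composition can be sketched with a hypothetical helper; the number sequence format depends on your setup.

```python
def history_report_name(message_name: str, number_sequence: str) -> str:
    # Compose the name as described: Message [message name]_[number sequence]
    return f"Message {message_name}_{number_sequence}"
```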

View file history

Operation
You can review and analyze the file history of message runs for messages with an Azure file storage connector. All actions that are done to the related file, are registered in the file history.
Registered file actions are the:
  • Standard file actions related to the Working, Archive, Error, and Split folders. For example, on import, a file is read from the Working folder and when successfully processed, it is moved to the Archive folder. Both the 'read' and the 'move' action are registered.
  • Additional file actions as set up for the Azure file storage connector.

View message run history

Operation

You can review and analyze the history of message runs.

View message run record history

Operation For each message run, the processed records with errors are shown. For each shown record, you can view the record details. If required, you can change the values to be set.

View messages on Service Bus queue or topic subscription

Operation

You can view the messages on a Service Bus queue or topic subscription for a specific connector.

When you click 'Show Service Bus data', the related queue or topic subscription is read, and the results are shown on the 'Show Service Bus data' page.
Note: The read Service Bus data is not stored. Each time you open the page, the data is reloaded.

View original message run history

Operation If a message run was run before the currently shown run, you can view the history of that previous message run.

View outbound queue

Operation For messages and web service actions, you can use table events to track data changes. You can define, for each table, which table events are logged. The table events are logged in the data synchronization log. On processing the data synchronization log, based on the logged events, records are added to the outbound queue. On processing the outbound queue, for each record, the related message or web service action is run to export the applicable data.
This topic explains how to view the outbound queue.

View project version log, status, or files

Analysis To monitor project versions, on the Projects page, you can show the project version:
  • Status
  • History log
  • Files

View staging journal

Operation

Use the staging journal to monitor the staged records.

If you use staging in your inbound process, you receive data with a message that stores the data in the staging journal.
If you monitor staging journals, the statuses are important. Refer to 'See also'.

View table relations

Design
When you set up an internal document, it can be useful to view the D365 FO table relations without accessing the development environment.
You can view the relations of the table as defined for the current record.

View where-used

Design
Sometimes, if you want to clean up your document setup, you cannot delete an element. In such a case, you can view where the element is used in the connectivity setup.
You can do so for a:
  • Document
  • Document record
  • Document record field

Web service error message is run - Inbound

Operation

When, in an inbound web service process, the request message does not run successfully or an error occurs, the inbound web service action runs the error message, if defined. This is managed by the handler class as defined for the web service action.
The error message sends the error information from D365 FO to the inbound web service. For example, you can use an error message to change the status of a record in the external application.

If the HTTP action is Post, Put, Delete, Patch, or Post or Put, the errors in the message history of the request message define the input of the error message.
For an error message the:
  • Source connector must be of type D365 FO.
  • Source document must be an internal document.
  • Target connector must be of type Azure file storage or Web service.
  • Target document must be an external file-based document.

Web service error message is run - Outbound

Operation
When, in an outbound web service process, an error is received from the web service, the outbound web service action runs the error message, if defined. This is managed by the handler class as defined for the web service action.
The error message processes the error from the web service in D365 FO. For example, you can use an error message to change the status of a record in D365 FO.
For an error message the:
  • Source connector must be of type Azure file storage or Web service.
  • Source document must be an external file-based document.
  • Target connector must be of type D365 FO.
  • Target document must be an internal document.

Web service request message is run - Inbound

Operation

When triggered, the inbound web service action first runs the request message, if defined. This is managed by the handler class as defined for the web service action.
The goal of the request message depends on the HTTP action of the web service action. In general, the request message provides D365 FO with data from the external application.

If the HTTP action is:
  • Post, use the request message to provide the data to be created in D365 FO. 
  • Put, use the request message to provide the data to be updated in D365 FO.
  • Get, no request message is used. The data to get is defined by arguments. So, if the HTTP action is Get, set up the relevant arguments.
  • Patch, use the request message to provide the data to be updated in D365 FO. Use this to only update a part of a record. For example, only update a contact person of a customer.
  • Delete, use the request message to provide the data to be deleted from D365 FO.
  • Post or Put, use the request message to provide the data to be created or updated in D365 FO.
For a request message the:
  • Source connector must be of type Azure file storage or Web service
  • Source document must be an external file-based document.
  • Target connector must be of type D365 FO.
  • Target document must be an internal document.

Web service request message is run - Outbound

Operation

When triggered, the outbound web service action first runs the request message, if defined. This is managed by the handler class as defined for the web service action.

The goal of the request message depends on the HTTP action of the web service action. In general, the request message provides the web service with data from D365 FO.
If the HTTP action is:
  • Post, use the request message to provide the web service with the data to be created in the external application. 
  • Put, use the request message to provide the web service with the data to be updated in the external application.
  • Get, you can use the request message to provide the web service with search criteria to get the desired data. If no request message is defined, the web service is triggered based on the resource. In this case, the web service defines which data is sent back. Note: If the web service access type is:
    • SOAP: The output of the request message is added to the request in a SOAP envelope. In this case, make sure this is properly set up in the target document and the field mapping of the request message.
    • REST: The output of the request message is used to parameterize the URL.
  • Patch, use the request message to provide the web service with the data to be updated in the external application. Use this to only update a part of a resource. For example, only update a contact person of a customer.
  • Delete, use the request message to provide the web service with the data to be deleted from the external application.
  • Post or Put, use the request message to provide the web service with the data to be created or updated in the external application. To determine if the data must be created or updated, on the web service action, an attribute must be defined with the Key check box selected.
For a request message the:
  • Source connector must be of type D365 FO.
  • Source document must be an internal document.
  • Target connector must be of type Azure file storage or Web service.
  • Target document must be an external file-based document.

Web service response message is run - Inbound

Operation

When, in an inbound web service process, the requested data is received from D365 FO, the inbound web service action runs the response message, if defined. This is managed by the handler class as defined for the web service action.
The goal of the response message depends on the HTTP action of the web service action. In general, the response message sends the response from D365 FO to the inbound web service.

If the HTTP action is:
  • Post, Put, Delete, Patch, or Post or Put, you can use the response message to process the answer to the inbound web service. The RecIds that are defined in the request message, are the input for the response message.
  • Get, use the response message to process the requested data from D365 FO to the inbound web service. The arguments that are defined for the inbound web service action, define the input for the response message.
For a response message the:
  • Source connector must be of type D365 FO.
  • Source document must be an internal document.
  • Target connector must be of type Azure file storage or Web service.
  • Target document must be an external file-based document.

Web service response message is run - Outbound

Operation

When, in an outbound web service process, the requested data is received from the web service, the outbound web service action runs the response message, if defined. This is managed by the handler class as defined for the web service action.
The goal of the response message depends on the HTTP action of the web service action. In general, the response message processes the response from the web service in D365 FO.

If the HTTP action is:
  • Post, Put, Delete, Patch, or Post or Put, you can use the response message to process the answer from the web service.
  • Get, use the response message to process the requested data, as received from the web service, in D365 FO.
For a response message the:
  • Source connector must be of type Azure file storage or Web service.
  • Source document must be an external file-based document.
  • Target connector must be of type D365 FO.
  • Target document must be an internal document.

Provide feedback