A colleague developing for On-Prem Business Central using VS Code ran into an authentication error when downloading symbols. Although he provided his domain user and password, the authentication failed.
Authorization has failed or the credentials have expired. The credential cache has been cleaned. Any access to reach Business Central would require new authorization.
Symptoms
Business Central On-Prem installation on Domain Joined Server
Windows Authentication
Visual Studio Code installed on a non-domain joined Windows 11 client (WORKGROUP)
Work Account (Azure AD / Entra ID user) logged on in Windows 11
Opening the URL in a browser asks for user + password but works correctly
AL: Download Symbols reports an error
Solution: RunAs
A workaround that solved the issue was to start VS Code via the RunAs command and provide the domain user and password.
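A minimal sketch of that call, assuming the default per-user VS Code install path and a placeholder domain and user; the /netonly switch makes Windows use the supplied credentials for network access only, which is exactly what a non-domain-joined client needs:

runas /netonly /user:CONTOSO\jdoe "C:\Users\jdoe\AppData\Local\Programs\Microsoft VS Code\Code.exe"

Windows then prompts for the domain password, and AL: Download Symbols authenticates with the domain account.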
Although Tier1 Cloud Hosted Environments (CHE) will be deprecated, there are still good reasons (cost) to use them instead of UDEs. A Tier1 CHE from within Lifecycle Services can be paired with a Power Platform environment, including a Dual Write configuration with Dynamics 365 Sales. In this Tier1 configuration I've encountered an authentication error when creating new records in FO.
History and symptoms
I've deployed a new Tier1 Dev/Test environment (10.0.43) to the linked Azure subscription. In the configuration dialog in LCS I've enabled the Power Platform integration based on a Dynamics 365 Sandbox. Deployment took a while but succeeded.
After the deployment I've restored an AxDB database with some basic configuration and performed a full database synchronization in Visual Studio.
Next I've linked the D365 Finance and Supply Chain machine with the deployed D365 Sales Sandbox environment from the LCS environment details page. Finally I've enabled the Dual Write configuration, also from the LCS environment details page.
In the Power Platform Admin Center, under Environment > Dynamics 365 Apps, two solutions were already deployed: Dynamics 365 Dual Write Core and Dynamics 365 Dual Write Application Core. Because the environment was a sales prototype, I've added Dynamics 365 Dual Write Human Resource, Dynamics 365 Dual Write Finance, Dynamics 365 HR Common Tables (!), Dynamics 365 Dual Write Global Address Book and the Dynamics 365 Dual Write Supply Chain solution from AppSource.
In Dynamics 365 Finance & Supply Chain I've imported the Core solution and the Supply Chain solution in the Data Management workspace. The table mappings were populated successfully and I've chosen to synchronize only one legal entity (company) with Sales.
The basic table mappings (e.g. Legal Entities, Customer Groups, Currencies, Units, Sizes, Colors, etc.) including the initial synchronization from FO to Sales were successful. I've also enabled synchronization between CustomersV3 and Accounts.
In Dynamics 365 Sales it was possible to create a new account of type customer, link it to the corresponding legal entity and assign a customer group. The customer account from Dynamics 365 Sales was successfully synchronized into FO within seconds and became a customer.
The other direction, from Dynamics 365 Finance Supply Chain to Dynamics 365 Sales, did not work. As soon as a new record in a synchronized table was created, a Dual Write error message came up, no matter whether it was a customer, a product, etc.
Unable to write data to entity accounts
Authentication failed with error
Failed to authenticate for https://__crm4.dynamics.com/api/data/v9.0/accounts
For information on troubleshooting see
https://go.microsoft.com/fwlink/?linkid=2244045
Unable to get access token
Solution Step-by-Step
Microsoft has cut off CHE connections from accessing tenant information. I found some blog posts from other folks dealing with Business Events, and Microsoft documentation on setting up Onebox environments. Following those instructions I've done the steps below:
App Registration
I've created an app registration in Entra ID and added the URL of the D365 FO environment as a redirect URI of type Web, as well as the same URL with /oauth appended.
I've granted the API permissions User.Read, User.Read.All, Dynamics ERP AX.FullAccess and LCS user_impersonation.
Self-Signed Certificate via PowerShell
Following the documentation I've created a simple self-signed certificate via PowerShell on the Tier1 VM.
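A minimal sketch of what that can look like (the subject name and export path are placeholders, not the exact values from the documentation):

# Create a self-signed certificate in the local machine store (subject name is a placeholder)
$cert = New-SelfSignedCertificate -DnsName "d365fo-s2s-auth" -CertStoreLocation "Cert:\LocalMachine\My"
# Export the public part as a .cer file; note the thumbprint for web.config later
Export-Certificate -Cert $cert -FilePath "C:\Temp\d365fo-s2s-auth.cer"
$cert.Thumbprint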
The export generates a .cer file. By double clicking the file you can install the certificate on the computer. I've done this twice: once for the local machine, placing it in the Trusted Root Certification Authorities store, and once for the current user, again placing it in the Trusted Root Certification Authorities store.
web.config and wif.config
The web.config file can be found in the K:\AosService\WebRoot directory. I've made a copy before editing and changed the following lines. The spn: value is the application ID from the app registration. The thumbprint can be found in the .cer file on the Details tab.
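As a sketch based on the Onebox setup documentation (MY_APP_ID_HERE and MY_CERT_THUMBPRINT are placeholders), the relevant appSettings entries look roughly like this:

<!-- web.config appSettings: app registration ID and certificate thumbprint -->
<add key="Aad.Realm" value="spn:MY_APP_ID_HERE" />
<add key="Infrastructure.S2SCertThumbprint" value="MY_CERT_THUMBPRINT" />
<add key="GraphAPI.ServicePrincipalThumbprint" value="MY_CERT_THUMBPRINT" />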
In the wif.config I've also added the application ID:
<?xml version="1.0"?>
<system.identityModel>
  <identityConfiguration>
    <securityTokenHandlers>
      <securityTokenHandlerConfiguration>
        <audienceUris>
          <!-- WARNING: MUST be first element; updated at web role instance startup -->
          <add value="spn:00000015-0000-0000-c000-000000000000" />
          <add value="spn:MY_APP_ID_HERE" />
        </audienceUris>
      </securityTokenHandlerConfiguration>
    </securityTokenHandlers>
  </identityConfiguration>
</system.identityModel>
Application User in Dynamics 365 Power Platform
In the Power Platform Admin Center, on the environment page, under Settings > Users + permissions > Application users, I've added an application user. I've assigned it the System Administrator role, like the already existing Finance and Operations service account.
EntraID Application in FO
To avoid any additional problems I've also created a new record in Dynamics 365 Finance Supply Chain on the EntraID Applications page.
Finally the synchronization from Dynamics 365 Finance Supply Chain to Dynamics 365 Sales was working. Customers, products, sales orders, invoices, etc. can be created in FO and found in Sales.
Since the original Dual Write orchestration package has been replaced by a set of smaller packages, the packages need to be set up in the correct order. You can deploy a Power Platform environment using the Dynamics 365 template from within LCS.
Also from the environment full details page, you can trigger the installation of the Dual Write solution and perform its initial configuration. You will end up with the Dual Write Core and Dual Write Application Core solutions installed. To get the full Dual Write experience you have to install the other packages as well.
You might run into an installation error when deploying Dual Write packages from AppSource.
Error details
How to troubleshoot
To fix this problem retry installation. For directions on how to do this see here.
msdyn_Dynamics365SupplyChainExtendedAnchor
NotProvided
Solution manifest import: FAILURE: The following solution cannot be imported: Dynamics365SupplyChainExtended. Some dependencies are missing. The missing dependencies are : <MissingDependencies><MissingDependency canResolveMissingDependency="True"><Required type="1" schemaName="cdm_worker" displayName="Worker" solution="HCMCommon (2.2.0.3)" /><Dependent type="10" schemaName="msdyn_transferorder_receivingcontactpersonnel" displayName="msdyn_transferorder_receivingcontactpersonnel" parentSchemaName="cdm_worker" parentDisplayName="Worker" /></MissingDependency></MissingDependencies> , ProductUpdatesOnly : False
There is also an additional package that is not labeled "Dual Write" but is nevertheless required: the Dynamics 365 HR Common Tables package.
Here is an order of installation that worked for me (Apr. 2025):
The combination of Dynamics 365 Finance and Supply Chain Management Cloud ERP and Microsoft Flow helps to automate many tasks. For example, you can easily notify a customer about changes regarding the confirmed shipping date.
Concept
Configure a Blob Storage Endpoint for Business Events
Create a Change Based alert for ShippingDateConfirmed
Add a flow that triggers on new events
Read Email from sales order and notify customer
Configure a Business Event Endpoint
Business events in Dynamics 365 FSCM can be used to notify external systems. Different endpoint types are supported, such as Azure Event Hub, HTTPS webhook and Blob Storage account. I personally prefer a storage account because it is cheap and easy to use, understand and support.
In the Entra ID admin portal (formerly Azure Active Directory) create a new app registration. Note the client ID and create a new secret. Note the secret value as well.
In order to use a blob container for business events you need some resources. First, of course, a storage account with a public endpoint. Copy the storage account connection string. In the menu on the left side select Storage browser, navigate to the blob storage and create a new container.
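If you prefer scripting over the portal, here is a minimal Az PowerShell sketch; the resource group, account and container names are placeholders:

# Requires the Az.Storage module and an authenticated session (Connect-AzAccount)
$rg = "rg-businessevents"
$sa = New-AzStorageAccount -ResourceGroupName $rg -Name "stbusinessevents001" -Location "westeurope" -SkuName "Standard_LRS"
# Container that will receive the business event JSON blobs
New-AzStorageContainer -Name "businessevents" -Context $sa.Context
# Build the connection string that goes into Key Vault in the next step
$key = (Get-AzStorageAccountKey -ResourceGroupName $rg -Name $sa.StorageAccountName)[0].Value
"DefaultEndpointsProtocol=https;AccountName=$($sa.StorageAccountName);AccountKey=$key;EndpointSuffix=core.windows.net"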
Next, create a key vault to store the connection string. When creating the key vault make sure to use vault access policies (not RBAC). In the key vault create a new secret and place the connection string there.
In the key vault switch to Access policies and create a new policy that grants the registered app the permissions to list and get secrets.
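Again as an optional Az PowerShell sketch; the vault name, secret name and the app's client ID are placeholders:

# Key vault with classic access policies (RBAC explicitly disabled so the policy below applies)
New-AzKeyVault -ResourceGroupName "rg-businessevents" -VaultName "kv-businessevents" -Location "westeurope" -EnableRbacAuthorization:$false
# Store the storage account connection string as a secret
$conn = ConvertTo-SecureString "<storage connection string>" -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "kv-businessevents" -Name "StorageConnectionString" -SecretValue $conn
# Allow the app registration to read secrets (client ID is a placeholder)
Set-AzKeyVaultAccessPolicy -VaultName "kv-businessevents" -ServicePrincipalName "MY_APP_ID_HERE" -PermissionsToSecrets get,list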
In Dynamics 365 Finance and Supply Chain Management open the business event catalog (System administration > Setup > Business events). Switch to the Endpoints tab and create a new Blob Storage endpoint. In the dialog provide, among other values, the Key Vault DNS name, the client ID and secret of the app registration, and the name of the secret that holds the connection string.
Switch to the business event catalog and filter the Business event ID column for entries containing the term "Alert". Make sure you select BusinessEventsAlertEvent and click Activate. In the dialog select the legal entity and the recently created blob endpoint.
Test Business Event Endpoint configuration for Alerts
Make sure you have a batch job handling change-based alerts in Dynamics 365 Finance and Supply Chain. If you don't have such a batch job, create one from System administration > Periodic tasks > Alerts > Change based alerts. Set the recurrence to no end date and provide a time interval, e.g. 10 minutes.
In Dynamics 365 FSCM go to an existing sales order or create one. In the top menu switch to Options and select Create custom alert.
In the alert dialog choose the Confirmed ship date field from the field drop-down. This will change the alert trigger to Has Changed. Make sure to activate the Send externally option as well. Save the alert rule.
Change the confirmed ship date in the sales order. Depending on the time interval of the change-based alerts batch job, you will get notified that the value has changed.
Switch to the Azure portal and go to your storage account. In the Storage browser, select the blob storage and the container you created for the business events. There you should see at least one entry named with a GUID.
Download the file and open it in a text editor. It should contain the JSON for the business event. You will find the sales order number in KeyValue1 and the legal entity in the DataAreaId property. You can use these values to look up the sales order in D365.
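A trimmed, illustrative sketch of such a file (the values are made up, and the exact property set, e.g. KeyValue1 vs. KeyFieldValue1, depends on the version, so verify against your own download; a real event contains more properties):

{
  "BusinessEventId": "BusinessEventsAlertEvent",
  "EventId": "a0000000-0000-0000-0000-000000000000",
  "EventTime": "/Date(1714060800000)/",
  "DataAreaId": "demf",
  "KeyValue1": "001234"
}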
Create a flow to notify the customer
Go to Power Automate and create a new flow that triggers when a blob is created or modified. Check whether the event was a shipping date change and send an email to the customer. The flow may look like this.
The first action, Get blob content, downloads the event file itself. The next action parses the event JSON. Since the blob file has no file extension, it is necessary to pass the content through string() to the parser. The schema can be generated from an example, e.g. copy the JSON string from the test file and flow will generate the schema.
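The Content input of the Parse JSON action then holds an expression along these lines (the action name Get blob content (V2) is an assumption; use whatever your action is actually called):

string(body('Get_blob_content_(V2)'))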
Because the blob storage account may be used by other business events in the future, it is advisable to add a condition that checks whether the alert was triggered by the shipping date.
Next use the DataAreaId and KeyFieldValue1 to look up the sales order in Dynamics 365 FSCM by combining both values separated with a comma, e.g. demf,001234.
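In the flow this lookup key can be built with a concat() expression (assuming the parse action is named Parse JSON and the property names match your payload):

concat(body('Parse_JSON')?['DataAreaId'], ',', body('Parse_JSON')?['KeyFieldValue1'])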
Add a second condition to check whether an email address is provided in the sales header. If so, use the Send email action to notify the customer. If required you may work on the FALSE branch and look up the customer instead to find an email address.
There are many ways to analyze Dynamics 365 Finance and Supply Chain data in Fabric. One way is to replicate the Entity Export database, load it into a Lakehouse and combine it with additional data for reporting. Here is a video that shows the complete process from scratch:
Considerations
Exporting entities from Dynamics 365 Finance and Supply Chain to your own Azure SQL DB has been around for a long time. There are some considerations and limitations when using the BYOD (Bring Your Own Database) feature with Fabric:
Sizing: 100+ DTUs are required; the Free and small Basic tiers are not supported.
Networking: a public endpoint is required, with access for Azure services allowed.
Load Dynamics 365 F/SCM in Synapse and visualize in PowerBI
Synapse Link for Dataverse is the replacement for Export to Data Lake, which was deprecated in 2023. Although it is called Link for Dataverse, it can be used to access Dynamics 365 Finance and Supply Chain tables (and CE data from Dataverse as well).
SQL change tracking has to be enabled in D365 F/SCM. Creates, updates and deletes are written to a storage account in CSV format. Synapse runs a Spark pool that converts the CSVs into the Delta Lake format, which relies on the (open-standard) Parquet format. As a result you can see and query the F/SCM tables in the lake as if they were tables in a relational database.
Good news: Synapse has a serverless SQL pool and a public SQL endpoint. You can find the SQL endpoint from the Manage icon (bottom left) > SQL pools > select your pool > Workspace SQL endpoint.
You can create databases in this SQL pool and create views based on the tables in the data lake. For example, join CustTable and DirPartyTable to provide a view that contains customer data with name and address.
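A minimal sketch of such a view, assuming the lake database is named d365lake (a placeholder) and leaving the address join out for brevity; CustTable.Party references DirPartyTable.RecId:

-- View in a serverless SQL pool database: customer accounts with their party name
CREATE VIEW dbo.CustomerView AS
SELECT c.DATAAREAID, c.ACCOUNTNUM, c.CUSTGROUP, p.NAME
FROM d365lake.dbo.CUSTTABLE AS c
JOIN d365lake.dbo.DIRPARTYTABLE AS p ON p.RECID = c.PARTY;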
You can use the web-based development workspace in Synapse, but you can also connect with other tools, e.g. SQL Server Management Studio. Both the tables in the data lake and the views in another database can be accessed.
PowerBI has a built-in connector for Synapse Workspace. You can easily load data from the tables and also from the views.
When working on PowerBI projects for a longer time or supporting a customer, version control is a desirable feature. In many cases PowerBI files are stored on a file share or in SharePoint. At this moment (August 2023) there is no integrated version control feature in PowerBI Desktop. But we can use features from different software products to build a version control strategy for PowerBI.
There is a preview feature called "Save as PowerBI project" in PowerBI Desktop. It splits the PowerBI report into multiple files that contain the model definition, the report layout and some more artifacts. Now that we have multiple files in a project folder, the obvious idea is to put these files under a version control system.
You can use Git as the version control system on your local PC or wherever the PowerBI reports are developed. Git keeps a local repository that can be connected to a central one. In Azure DevOps you can set up projects using Git as the version control system. Connect your local PowerBI Git repository with Azure DevOps to manage your PowerBI report development.
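A minimal sketch of that wiring (the DevOps organization, project and repository names are placeholders):

# Initialize a repository in the PowerBI project folder and make a first commit
git init
git add .
git commit -m "Initial PowerBI project"
git branch -M main
# Connect to the Azure DevOps Git repository and push
git remote add origin https://dev.azure.com/myorg/MyProject/_git/MyPowerBIReports
git push -u origin main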
There are many ways to access, import and export data in Dynamics 365 Finance & Supply Chain Management. Find here a one-page PDF summary of 12 common ways. Every solution has its pros and cons.
Microsoft has recently released the new VHD for Dynamics 365 Finance and Operations 10.0.24 for download from LCS. If you immediately try to execute the Admin Provisioning Tool and provide your domain user, you will get an error:
The value’s length for key ‘password’ exceeds it’s limit of ‘128’
There are some steps required before you can assign it to your Domain:
Dynamics 365 Finance & Supply Chain allows you to store attachments in the database, in Azure Blob storage and on SharePoint. See here how you can configure SharePoint as a storage location and create a document type "Fact Sheet":