VSCode AL: Download Symbols Authorization has failed

A colleague developing for On-Prem Business Central using VS Code ran into an authentication error when downloading symbols. Although he provided his domain user and password, authentication failed.

Authorization has failed or the credentials have expired. The credential cache has been cleaned. Any access to reach Business Central would require new authorization.

Symptoms

  • Business Central On-Prem installation on Domain Joined Server
  • Authentication: Windows
  • Visual Studio Code installed on a non-domain joined Windows 11 client (WORKGROUP)
  • Work Account (Azure AD / Entra ID user) logged on in Windows 11
  • Opening the URL in a browser asks for user + password but works correctly 🤔
  • AL: Download Symbols reports an error

Solution: RunAs

A workaround that solved the issue was to use the RunAs command and provide the domain user + password.

runas /user:AD_DOMAIN\AD_USER "C:\PATH_TO_VSCODE.EXE"
Download Symbols Successful

Dual Write Error “Failed to authenticate” and “Unable to get access token” from Tier1 Cloud-Hosted to Dynamics 365 Sales

Although Tier1 Cloud Hosted Environments (CHE) will be deprecated, there are still good reasons (💰) to use them instead of UDEs. A Tier1 CHE from within Lifecycle Services can be paired with a Power Platform environment, including a Dual Write configuration with Dynamics 365 Sales. In this Tier1 configuration I’ve encountered an authentication error when creating new records in FO.

History and symptoms 🤒

I’ve deployed a new Tier1 Dev/Test environment 10.0.43 to the linked Azure subscription. In the configuration dialog in LCS I’ve enabled Power Platform integration based on a Dynamics 365 Sandbox. Deployment took a while but succeeded. 👍

After the deployment I’ve restored an AxDB database with some basic configurations and performed a full database synchronization in Visual Studio. 👍

Next I’ve linked the D365 Finance and Supply Chain machine with the deployed D365 Sales sandbox environment from the LCS environment details page. Then I’ve enabled the Dual Write configuration, also from the LCS environment details page. 👍

In Power Platform Admin Center, under Environment > Dynamics 365 Apps, two solutions were already deployed: Dynamics 365 Dual Write Core and Dynamics 365 Dual Write Application Core. Because the environment was a sales prototype, I’ve added the Dynamics 365 Dual Write Human Resource, Dynamics 365 Dual Write Finance, Dynamics 365 HR Common Tables (!), Dynamics 365 Dual Write Global Address Book and Dynamics 365 Dual Write Supply Chain solutions from AppSource. 👍

In Dynamics 365 Finance & Supply Chain, in the Data Management workspace, I’ve imported the Core solution and the Supply Chain solution. The table mappings have been populated successfully and I’ve chosen to synchronize only one legal entity (company) with Sales. 👍

The basic table mappings (e.g. Legal Entities, Customer Groups, Currencies, Units, Sizes, Colors, etc.) including the initial synchronization from FO to Sales were successful. I’ve also enabled synchronization between CustomersV3 and Accounts. 👍

In Dynamics 365 Sales it was possible to create a new account of type customer, link it to the corresponding legal entity and assign a customer group. The account from Dynamics 365 Sales was successfully synchronized into FO within seconds and became a customer. 👍

The other direction, from Dynamics 365 Finance and Supply Chain to Dynamics 365 Sales, did not work. As soon as a new record in a synchronized table was created, a Dual Write error message came up, no matter whether it was a customer, product, etc. 🤬

Unable to write data to entity accounts
Authentication failed with error
Failed to authenticate for https://__crm4.dynamics.com/api/data/v9.0/accounts
For information on troubleshooting see
https://go.microsoft.com/fwlink/?linkid=2244045
Unable to get access token 

Solution Step-by-Step

Microsoft has cut off CHE connections from accessing tenant information. I found some blog posts from other folks dealing with Business Events, and Microsoft documentation on setting up Onebox environments. Following the instructions, I’ve done the following:

App Registration

I’ve created an app registration in Entra ID. I’ve added the URL of the D365 FO environment as a reply URL (redirect URI) of type Web, as well as the same URL with /oauth appended.

I gave it API permissions for User.Read, User.Read.All, Dynamics ERP AX.FullAccess and LCS user_impersonation.

Self-Signed Certificate via PowerShell

Following the documentation, I’ve created a simple self-signed certificate via PowerShell on the Tier1 VM.

$certname = "myselfcert"

$cert = New-SelfSignedCertificate -Subject "CN=$certname" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256

Export-Certificate -Cert $cert -FilePath "C:\Users\Admin051a5b362b\Desktop\$certname.cer"

Add to root-authorities

The export generates a .cer file. Double-clicking the file lets you install it on the computer. I’ve done this twice: once for the local machine, choosing the Trusted Root Certification Authorities store, and once for the current user, again placing it in the Trusted Root Certification Authorities store.

web.config and wif.config

The web.config file can be found in the K:\AosService\WebRoot directory. I’ve made a copy before editing and changed the following lines. The value after spn: is the application ID from the app registration. The thumbprint can be found in the .cer file on the Details tab.

<add key="Aad.Realm" value="spn:<your application ID>" />
<add key="Infrastructure.S2SCertThumbprint" value="<certificate thumbprint>" />
<add key="GraphApi.GraphAPIServicePrincipalCert" value="<certificate thumbprint>" />
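As a side note, the thumbprint shown on the certificate's Details tab is simply the SHA-1 hash of the DER-encoded certificate bytes. A minimal sketch to compute it yourself from the exported .cer file (assuming Export-Certificate produced a DER-encoded file; the file path is illustrative):

```python
import hashlib

def cert_thumbprint(der_bytes: bytes) -> str:
    """Return the Windows-style certificate thumbprint:
    SHA-1 over the raw DER bytes, as uppercase hex."""
    return hashlib.sha1(der_bytes).hexdigest().upper()

# Illustrative usage against the exported file:
# with open(r"C:\Users\Admin\Desktop\myselfcert.cer", "rb") as f:
#     print(cert_thumbprint(f.read()))
```

This can be handy to verify that the thumbprint you pasted into web.config really matches the exported certificate.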

In wif.config I’ve also added the application ID:

<?xml version="1.0"?>
<system.identityModel>
  <identityConfiguration>
    <securityTokenHandlers>
      <securityTokenHandlerConfiguration>
        <audienceUris>
       <!-- WARNING: MUST be first element; updated at web role instance startup -->
          <add value="spn:00000015-0000-0000-c000-000000000000" />
          <add value="spn:MY_APP_ID_HERE" />
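Editing wif.config by hand is error-prone. Purely as an illustration (the XML fragment below is simplified, the app ID is a placeholder, and you should back up the real file first), the audience URI could also be appended with Python's standard-library XML tools:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for wif.config; the real file contains more elements.
WIF_FRAGMENT = """<?xml version="1.0"?>
<system.identityModel>
  <identityConfiguration>
    <securityTokenHandlers>
      <securityTokenHandlerConfiguration>
        <audienceUris>
          <add value="spn:00000015-0000-0000-c000-000000000000" />
        </audienceUris>
      </securityTokenHandlerConfiguration>
    </securityTokenHandlers>
  </identityConfiguration>
</system.identityModel>"""

def add_audience_uri(xml_text: str, app_id: str) -> str:
    """Append <add value="spn:APP_ID" /> to the audienceUris element."""
    root = ET.fromstring(xml_text)
    uris = root.find(".//audienceUris")
    entry = ET.SubElement(uris, "add")
    entry.set("value", f"spn:{app_id}")
    return ET.tostring(root, encoding="unicode")

patched = add_audience_uri(WIF_FRAGMENT, "MY_APP_ID_HERE")
print("spn:MY_APP_ID_HERE" in patched)  # True
```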

Application User in Dynamics 365 Power Platform

In the Power Platform Admin Center, on the environment page, under Settings > Users + permissions, I’ve added an application user. I’ve assigned it the System Administrator role, like the already existing Finance and Operations service account.

EntraID Application in FO

To avoid any additional problems, I’ve also created a new record in Dynamics 365 Finance and Supply Chain on the Entra ID applications page.

Finally, the synchronization from Dynamics 365 Finance and Supply Chain to Dynamics 365 Sales was working. Customers, products, sales orders, invoices etc. can be created in FO and found in Sales. 💚

Dual Write Installation error missing dependencies

Since the original Dual Write orchestration package has been replaced by a set of smaller packages, it’s necessary to install the packages in the correct order. You can deploy a Power Platform environment using the Dynamics 365 template from within LCS.

Also, from the environment full details page, you can trigger the installation of the Dual Write solution and perform its initial configuration. You will end up with the Dual Write Core and Dual Write Application Core solutions installed. For the full Dual Write experience you have to install the other packages as well.

Dynamics 365 Dual Write Packages from AppSource

You might run into an installation error when deploying Dual Write packages from AppSource.

Error details:
msdyn_Dynamics365SupplyChainExtendedAnchor (NotProvided)
Solution manifest import: FAILURE: The following solution cannot be imported: Dynamics365SupplyChainExtended. Some dependencies are missing. The missing dependencies are : <MissingDependencies><MissingDependency canResolveMissingDependency="True"><Required type="1" schemaName="cdm_worker" displayName="Worker" solution="HCMCommon (2.2.0.3)" /><Dependent type="10" schemaName="msdyn_transferorder_receivingcontactpersonnel" displayName="msdyn_transferorder_receivingcontactpersonnel" parentSchemaName="cdm_worker" parentDisplayName="Worker" /></MissingDependency></MissingDependencies> , ProductUpdatesOnly : False

How to troubleshoot:
To fix this problem retry installation. For directions on how to do this see here.
Dual Write installation error, dependencies Worker missing

There is also an additional package that is not labeled “Dual Write” but is required: the Dynamics 365 HR Common Tables package.

Dynamics 365 HR Common Tables solution required for Dual Write

Here is an installation order that worked for me (Apr. 2025):

  • Dual Write Human Resource solution
  • Dual Write Finance Solution
  • Dynamics 365 HR Common Tables
  • Dual Write Global Address Book
  • Dual Write Supply Chain

Order Change: Notify customers using Business Events and Flow

The combination of Dynamics 365 Finance and Supply Chain Management Cloud ERP and Microsoft Flow helps to automate many tasks. For example, you can easily notify a customer about changes regarding the confirmed shipping date.

Concept

  • Configure a Blob Storage Endpoint for Business Events
  • Create a Change Based alert for ShippingDateConfirmed
  • Add a flow that triggers on new events
  • Read Email from sales order and notify customer

Configure a Business Event Endpoint

Business events in Dynamics 365 FSCM can be used to notify external systems. They support different endpoint types like Azure Event Hub, HTTPS webhook and Blob Storage account. I personally prefer a storage account because it is a very cheap and easy to use, understand and support piece of cloud infrastructure.

In the Entra ID admin portal (Azure Active Directory) create a new app registration. Note the client ID and create a new secret. Note the secret value as well.

In order to use a blob container for business events you need some resources. First, of course, a storage account with a public endpoint. Copy the storage account connection string. In the menu on the left side select Storage browser, navigate to Blob storage and create a new container.

Azure Storage Account for storing business events in a blob container

Next, create a key vault to store the connection string. When creating the key vault, make sure to use vault access policies. In the key vault create a new secret and place the connection string there.

Azure Key Vault with Vault Access Policies

In the key vault switch to Access policies and create a new one. Assign the registered app the rights to list and get secrets.

Assign List and Get secrets permissions to the service principal

In Dynamics 365 Finance and Supply Chain Management open the Business Event Catalog (System Administration > Setup > Business Events). Switch to the Endpoints tab and create a new Blob endpoint. In the dialog provide:

  • a meaningful name
  • the name of the Blob container
  • client ID from the app registration
  • client secret
  • the key vault’s URI (from the key vault’s overview pane)
    e.g. https://yourkeyvault.vault.azure.net/
  • the name of the secret that holds the connection string

Switch to the Business Event Catalog and filter the Business Event ID column for entries containing the term “Alert”. Make sure you select BusinessEventsAlertEvent and click Activate. In the dialog select the legal entity and the recently created blob endpoint.

Business Events Catalog in Dynamics 365 Finance and Supply Chain Management

Test Business Event Endpoint configuration for Alerts

Make sure you have a batch job handling change based alerts in Dynamics 365 Finance and Supply Chain. If you don’t have such a batch job, create one from System Administration > Periodic Tasks > Alerts > Change Based Alerts. Set the recurrence to no end date and provide a time interval, e.g. 10 minutes.

In Dynamics 365 FSCM go to an existing sales order or create one. In the top menu switch to Options and select Create custom alert.

Create a change based alert for sales orders in Dynamics 365 Finance Supply Chain Management

In the alert dialog choose Confirmed ship date from the field drop-down. This will change the alert trigger to Has changed. Make sure to activate the Send externally option as well. Save the alert rule.

Create a change based alert for sales orders in Dynamics 365 Finance Supply Chain Management

Change the confirmed ship date in the sales order. Depending on the time interval of the change based alerts batch job, you will get notified that the value has been changed.

Alert notification that the shipping date confirmed has been changed

Switch to the Azure Portal and go to your storage account. From the Storage browser, select Blob storage and the container you created for the business events. There you should see at least one entry named with a GUID.

Azure storage account with business events

Download the file and open it in a text editor. It should contain the JSON for the business event. You will find the sales order number in the KeyFieldValue1 property and the legal entity in the DataAreaId property. You can use these values to look up the sales order in D365.

Business Event JSON text
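Extracting the sales order reference from the downloaded blob is plain JSON handling. A sketch of the same logic in Python (the property names KeyFieldValue1 and DataAreaId come from the alert event payload; the sample values are invented):

```python
import json

# Simplified sample payload; real alert events carry many more properties.
event_text = json.dumps({
    "BusinessEventId": "BusinessEventsAlertEvent",
    "KeyFieldValue1": "001234",   # sales order number
    "DataAreaId": "demf",         # legal entity (company)
})

def sales_order_key(raw: str) -> str:
    """Build the composite 'company,order number' key used to
    look up the sales order in D365 FSCM."""
    event = json.loads(raw)
    return f'{event["DataAreaId"]},{event["KeyFieldValue1"]}'

print(sales_order_key(event_text))  # demf,001234
```

The same comma-joined key is what the flow later passes to the Dynamics 365 lookup action.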

Create a flow to notify the customer

Go to Power Automate and create a new flow that triggers when a blob is created or modified. Check if the event was a shipping date change and send an email to the customer. The flow may look like this.

Flow in Power Automate to handle business events

The first action, Get blob content, downloads the event file itself. The next action parses the event JSON. Since the blob file has no file extension, it is necessary to pass the content to the parser wrapped in string(). The schema can be generated from an example, e.g. copy the JSON from the test file and Flow will generate the schema.

Parse JSON action for Dynamics 365 Business Events

Because the blob storage account may be used by different business events in the future, it is advisable to add a condition that checks whether the alert was triggered by the shipping date.

Condition if change based alert was triggered by changing the shipping date

Next, use the DataAreaId and KeyFieldValue1 to look up the sales order in Dynamics 365 FSCM by combining both values separated with a comma, e.g. demf,001234

Lookup sales order in Dynamics 365 Finance and Supply Chain from flow

Add a second condition to check if an email address is provided in the sales header. If so, use the Send email action to notify the customer. If required, you may work on the FALSE branch and look up the customer instead to find an email address.

Send email with the updated shipping date to the customer

The final email may look like this

Email with the new shipping date

Mirror Dynamics 365 Finance SCM Entity Database in Fabric

There are many ways to analyze Dynamics 365 Finance and Supply Chain data in Fabric. One way is to replicate the Entity Export database. Then load it into a Lake House and combine it with additional data for reporting. Here is a video that shows the complete process from scratch:

Considerations

Entity Export to Azure SQL

Exporting entities from Dynamics 365 Finance and Supply Chain to your own Azure SQL DB has been around for a long time. There are some considerations and limitations when using the BYOD (Bring Your Own Database) feature with Fabric:

  • Sizing: 100+ DTUs are required. Free and small Basic versions are not supported.
  • Networking: Public Endpoint and allow Azure Services
  • Security: System Assigned Managed Identity ON
  • Table: Column store tables are not supported
  • Table: Primary Index required

Analyze Dynamics 365 Finance / SCM Data in Synapse (Video)

Load Dynamics 365 F/SCM in Synapse and visualize in PowerBI

Azure Synapse Link for Dataverse is the replacement for Export to Data Lake, which was deprecated in 2023. Although it is called Link for Dataverse, it can be used to access Dynamics 365 Finance and Supply Chain tables (and CE data from Dataverse as well).

Synapse link for Dataverse

SQL change tracking has to be enabled in D365 F/SCM. Creates, updates and deletes are written to a storage account in CSV format. Synapse runs a Spark pool that converts the CSVs into the Delta Lake format, which relies on the (open-standard) Parquet format. As a result you can see and query the F/SCM tables in the lake as if they were tables in a relational database.

Synapse Workspace with Dynamics 365 Finance and Supply Chain data

Good news: Synapse has a serverless SQL pool and a public SQL endpoint. You can find the SQL endpoint via the Manage icon (lower left) > SQL pools > select your pool > Workspace SQL endpoint.

You can create databases in this SQL pool and create views based on the tables in the data lake, for example join CustTable and DirPartyTable and provide a view that contains customer data with a name and address.

Create views in Synapse Workspace
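The kind of view described above can be sketched generically. Here SQLite stands in for the serverless SQL pool, with a minimal, illustrative subset of the F/SCM schema (CustTable.PARTY references DirPartyTable.RECID, which carries the party name; the sample data is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE DirPartyTable (RECID INTEGER PRIMARY KEY, NAME TEXT);
CREATE TABLE CustTable (ACCOUNTNUM TEXT, PARTY INTEGER);

INSERT INTO DirPartyTable VALUES (1001, 'Contoso Retail');
INSERT INTO CustTable VALUES ('US-001', 1001);

-- View joining customer accounts with their party record to expose the name
CREATE VIEW CustomerView AS
SELECT c.ACCOUNTNUM, p.NAME
FROM CustTable c
JOIN DirPartyTable p ON p.RECID = c.PARTY;
""")

for row in conn.execute("SELECT * FROM CustomerView"):
    print(row)  # ('US-001', 'Contoso Retail')
```

In Synapse you would issue the CREATE VIEW against a database in the serverless pool, with the data lake tables standing behind the FROM clause.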

You can work in the web-based Synapse development workspace, but you can also connect with other tools, e.g. SQL Server Management Studio. Tables in the data lake and views in another database can both be accessed.

SQL Endpoint for Synapse Workspace

PowerBI has a built-in connector for Synapse Workspace. You can easily load data from the tables and also from the views.

PowerBI Synapse connector

Version Control for PowerBI with Git

When working on PowerBI projects for a longer time or supporting a customer, version control is a desirable feature. In many cases PowerBI files are stored on a file share or SharePoint. At this moment (August 2023) there is no integrated version control feature in PowerBI Desktop, but we can combine features from different software products to build a version control strategy for PowerBI.

Version Control for PowerBI with Git

There is a preview feature called “Save as PowerBI project” in PowerBI Desktop. It splits the PowerBI report into multiple files that contain the model definition, the report layout and some more artifacts. Now that we have multiple files in a project folder, it is natural to put these files under a version control system.

You can use Git as the version control system on your local PC, or wherever the PowerBI reports are developed. Git keeps a local repository and can be connected to a central repository. In Azure DevOps you can set up projects using Git as the version control system. Connect your local PowerBI Git repository with Azure DevOps to manage your PowerBI report development.

Here you can read the original post from Microsoft: https://powerbi.microsoft.com/en-us/blog/deep-dive-into-power-bi-desktop-developer-mode-preview/

I’ve made a video that shows you how to setup version control and connect PowerBI with DevOps:

Data I/O Cheat Sheet for Dynamics 365 F/SCM

There are many ways to access, import and export data in Dynamics 365 Finance & Supply Chain Management. Find here a one-page PDF summary of 12 common ways. Every solution has its pros and cons.

Admin Provisioning Tool Error: The value’s length for key ‘password’ exceeds it’s limit of ‘128’

Microsoft has recently released the new VHD for Dynamics 365 Finance and Operations 10.0.24 for download from LCS. If you immediately try to execute the Admin Provisioning Tool and provide your domain user, you will get an error.

The value’s length for key ‘password’ exceeds it’s limit of ‘128’

There are some steps required before you can assign your domain account:

  1. Go to https://portal.azure.com > Active Directory > App Registrations and register a new app
Register a new application in Azure Active Directory
  2. Provide the One-Box URL as response address
  3. Copy the AppID to your clipboard
App Registration for Dynamics 365 FO 10.0.24 One-Box Environment
  4. Execute the “Generate Self-Signed Certificates” PowerShell script from the Desktop
  5. Provide the AppID from the App Registration
Generate Self-Signed Certificates

  6. Execute the Admin Provisioning Tool and provide your domain account address
  7. Open https://usnconeboxax1aos.cloud.onebox.dynamics.com/ in Edge and log in

Dynamics 365 Finance and Operations 10.0.24 One-Box Environment

Store Dynamics 365 F/SCM Attachments on SharePoint

Dynamics 365 Finance & Supply Chain allows you to store attachments in the database, on Azure Blob storage and on SharePoint. See here how you can configure SharePoint as a storage location and create a document type “Fact Sheet”: