Call an Azure Function from X++ in Dynamics 365 Finance / SCM

Create an Azure Function

Azure Functions are a simple way to package and provide business logic as a web service without worrying about hosting a web server. Azure Functions can be implemented in different programming languages like C#, JavaScript, PHP, Java, etc. and can be hosted on Linux and Windows with different runtime environments that fit your needs.

In the Azure Portal click + Create a resource and search for Function App:

Create an Azure Function App

In the next screen, choose a subscription and create a resource group (or use an existing one if you like). Provide a useful name and choose Code as the publish method. Select .NET Core 3.1 as the runtime stack and a region near your location:

Configure the Azure Function App to use .NET Core 3.1

Click Review + Create to create the Azure Function. It takes a few minutes to provision all the required elements:

Deploy the Azure Function App

Click on Go to Resource. Next to the Functions group click + to create a new function and select In-Portal to edit the function code directly in the browser:

Create a new HTTP trigger

Choose Webhook + API to create a demo function that can be called via HTTP POST.

Use webhook for the Azure Function

This will create a function that takes a name as parameter and returns “Hello ” + the parameter name.

C# Azure Function code

You can test the function by using the Test tab on the right. The function takes a JSON string with a name parameter and returns a simple string.

Test the Azure Function with a JSON string
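The request/response contract of the demo function can be sketched in a few lines of Python (an illustration only, not the actual C# template):

```python
import json

def handle_request(body: str) -> str:
    """Sketch of the demo Azure Function: read 'name' from the
    JSON request body and return a greeting."""
    data = json.loads(body)
    name = data.get("name")
    if not name:
        return "Please pass a name in the request body"
    return "Hello " + name

# The Test tab sends a JSON payload like this:
print(handle_request('{"name": "Azure"}'))  # Hello Azure
```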

Call the function from X++

In the Azure portal, get the function URL with a function key. Copy the URL including the key:

Copy the Azure Function URL with function key

In Visual Studio, create an X++ class with a main method for testing. Use the System.Net.Http.HttpClient class to call the service. The content is a JSON string, encoded in UTF-8, with a name parameter and value. In this example the name is Dynamics:

System.Net.Http.HttpClient httpClient = new System.Net.Http.HttpClient();
System.Net.Http.HttpContent content = new System.Net.Http.StringContent(
    '{ "name": "Dynamics" }',
    System.Text.Encoding::get_UTF8(),
    'application/json');

At the moment X++ does not support the await keyword for asynchronous calls. The workaround is to block on the task, e.g. via the Task.Result property. Call the service with your function URL and get the content of the response:

var task = httpClient.PostAsync("https://<YOUR_FUNCTION_URL>", content);
System.Net.Http.HttpResponseMessage msg = task.Result;

System.Net.Http.HttpContent ct = msg.Content;
var result = ct.ReadAsStringAsync();
System.String s = result.Result;
info(s);


Start the class from Visual Studio. The result should look like this:

Call the Azure Function from Dynamics 365 Finance

Connect to the SQL database of a Dynamics 365 Finance Test instance

In Dynamics 365 Finance / SCM we can no longer access the SQL database of the production environment directly. However, we can access the SQL database of the Acceptance Test instance. All required information can be found in LCS. I’ve made a video on where to find this information in LCS and how to connect to the SQL database.

Using SQL DDL Triggers to restore read Permissions programmatically after Dynamics AX 2009 Synchronization

Recently, a customer using Dynamics AX 2009 implemented a web service that accesses a view directly in SQL Server. To support it, they created a new SQL user login and gave the user read permissions on the view.

Read permission on a view

However, when synchronizing the data dictionary in Dynamics AX 2009, the views are dropped and recreated, and the permission on the object is lost. Therefore the web service call fails.

One way to address this issue from a SQL perspective is to create a DDL trigger that sets the permissions on the view programmatically. Here is a small SQL script that grants read permissions for the user view_user on the DIRPARTYVIEW after the view has been created again.

CREATE TRIGGER RestoreViewPermissions ON DATABASE FOR CREATE_VIEW
AS
BEGIN
    DECLARE @name SYSNAME
    SELECT @name = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]','SYSNAME')

    IF @name = 'DIRPARTYVIEW'
        GRANT SELECT ON [dbo].[DIRPARTYVIEW] TO [view_user]
END


Dynamics 365 FO: Export Entity Store to Azure Data Lake

Since version 10, Dynamics 365 for Finance and Operations supports exporting the entity store to Azure Data Lake. The main benefits are reduced cost, because Azure cloud storage is cheap, and easy access for Business Intelligence tools like PowerBI.

If you are running a local development VM, the data connection tab in system parameters is deactivated by default. However, it can be activated using the SysFlighting table.
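A sketch of such an activation (the flight name below is a placeholder for the feature you want to enable; 12719367 is the flight service ID Microsoft documents for non-production environments). Restart the AOS service (IIS) afterwards so the change is picked up:

```sql
-- Hypothetical flight name; replace with the flight to activate
INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID)
VALUES ('YourFlightName', 1, 12719367)
```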

The configuration is pretty well documented by Microsoft. I’ve performed all the necessary steps and recorded a video:

Connect Azure Data Lake Storage with PowerBI dataflow

PowerBI dataflow performs ETL (Extract Transform Load) workloads in the cloud. PowerBI Pro and Premium users get dataflow storage without additional charges. However, this storage is managed by PowerBI and you cannot access it directly. Therefore BYOSA (Bring Your Own Storage Account) is supported to connect your own Azure storage account with PowerBI dataflow. I’ve made a video, following the documentation, on how to connect an Azure storage account with PowerBI. Please find my video on YouTube:

Configure Azure Data Lake storage with PowerBI dataflow

Configure PowerBI on Dynamics 365 FO developer VM

I’ve created a video tutorial on how to configure PowerBI on a standalone Dynamics 365 Finance and Operations developer VM.

Setup multiple developer VMs for Version Control with Team Services in Dynamics 365 for Finance and Operations

Here is a walkthrough of how to connect two Dynamics 365 Finance and Operations developer VMs with VSTS.

Configure Azure AD and setup Visual Studio Team Services

Before you start, you have to add your developers to Azure Active Directory. If you have Azure AD Connect make sure the accounts have been synced to the Cloud. In my case I added two additional users to my Azure AD.

Configure Developer Accounts in Azure AD

Next, log on to the Azure Portal using your Organization Account. Create a new service and search for “Team Services”. You should find Visual Studio Team Services (preview).

Create Visual Studio Team Services project in Azure Portal

Create a new instance of VSTS. The basic version of VSTS for 5 users is free. Make sure to use Team Foundation Server as the version control system. You may choose any methodology you like, but I’d recommend going for a CMMI-compatible one.

Create Visual Studio Team Services project in Azure Portal

After the deployment succeeded, log on to your Visual Studio Team Services account using the URL https://<ACCOUNTNAME>. There you see a new project. Open the project and add your developer users by clicking the (+) icon on the upper left “Members” pane. If everything is configured correctly, you should be able to add your AD users. In my example, developer one and developer two.

Add developer accounts to Dynamics 365 FO project

Configure Developer VMs

If you are using the VHD images from Microsoft, the first thing you should do is rename the computer. If you don’t rename the VMs, you will get errors when mapping the source control on multiple machines. In my case I renamed the VMs to “devbox1” and “devbox2”. No domain is needed. Details here.

Rename Dynamics 365 FO developer VM
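Renaming can also be done in one line from an elevated PowerShell prompt (a sketch; “devbox1” is just the example name used here):

```powershell
# Rename the developer VM and reboot immediately
Rename-Computer -NewName "devbox1" -Restart
```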

Configure first Developer Box

After the VM reboots, open Visual Studio 2015 in admin mode. Depending on your licensing, you may need to provide additional credentials to use your subscription. Don’t get confused: this may be your Microsoft ID (aka Live ID), while you need your Organization Account to access VSTS. From the menu bar select Team > Manage Connections. Provide the URL for your VSTS account.

Connect to Visual Studio Team Services

After you have connected to VSTS select the project to work on. Next, from the Team Explorer open the Source Control explorer. Map the root folder to a folder on your developer VM.

Map Source Control Folder in Visual Studio

Afterwards, use the Source Control Explorer to create two new folders: one for Visual Studio projects and one for metadata. This is where the D365 source code artifacts will be stored. Check in your pending changes. This will sync the folders to VSTS.

Map Dynamics 365 FO metadata folder

Now, in the Source Control Explorer open the dropdown Workspace and edit your workspace. Map the metadata folder to C:\AOSService\PackagesLocalDirectory.

Map Dynamics 365 FO metadata folder

From the menu bar select Dynamics 365 > Model Management > Create a new model. Give it a name and complete the wizard. The wizard will ask you to create a new Dynamics X++ project.

Create new Dynamics 365 FO project

In the solution explorer, right click on the project and add to source control.

Check in to Source Control

Add a new element to the project, for example add a new String Extended Datatype called DMOCustomerName. In the solution explorer, right click the project and select build. After a few seconds you should see the console output “Build completed”. Check in all your pending changes.

Next, from the Team Explorer open the Source Control Explorer. You should see the model structure in the tree view. Right click the metadata folder and select “Add items to folder”. Navigate to your model folder and there to the Descriptor folder. Add the model descriptor XML file. Unfortunately, you have to do this manually; otherwise the second developer can sync the folders, extended data types, etc., but will not see the model in the AOT.

Add Dynamics 365 FO Model Descriptor File to Source Control

You can also inspect your code in VSTS

Dynamics 365 FO X++ Source Code

Configure second Developer Box

Make sure that the second VM is properly renamed. Open Visual Studio in admin mode and connect to VSTS. Log on with the account of the second developer. Select the Dynamics 365 project and, again in the Source Control Explorer, map the metadata folder to C:\AOSService\PackagesLocalDirectory. Get the latest version of the metadata folder.

Get Latest Version from metadata folder

This will create the model folder in the packages directory.

Model folder created in PackagesLocalDirectory

In Visual Studio, open the Dynamics 365 Application Explorer. If the AOT is in classic mode, right click and switch to the model view. Scroll down and you will see the synchronized model and its software artifacts.

Model in Dynamics 365 FO Application Explorer

Setup and Configure Office 365 Project Web App

Here you can find my video on how to set up Project Web App, link it with the Project desktop client, and set up a SharePoint subsite with a synchronized task list.


Inplace Upgrade of Windows Server 2012 HyperV to 2016

There is no upgrade path from the Windows Server 2012 (non-R2) HyperV role to Windows Server 2016 HyperV. The easiest way would be to set up a new Windows Server 2016 and activate HyperV. However, there may be reasons to perform an inplace upgrade. This is an experience report of doing so.

Prepare for the Upgrade

First make sure there are no virtual machine related files left on the C: drive. Move the configuration files and especially the virtual hard disk files to another drive. I recommend making a list of where the VHDX files are placed. This can easily be done via PowerShell:

Get-VM -VMName * | Select-Object VMId | Get-VHD | Select-Object Path | Format-Table

Next, uninstall the HyperV role using the Server Manager. Simply select remove roles and features and deselect the HyperV role. This step requires a reboot, and your server will come back up as a plain Windows Server 2012.


After HyperV has been uninstalled, make sure to remove NIC teaming in case you are using this feature. I recommend deactivating all but one network adapter. This can also be done using the Server Manager.

Perform Inplace Upgrade

Insert a disk or mount the Windows Server 2016 image and start the upgrade process. When asked, choose to keep all your data and apps. This will preserve your data and applications, e.g. the RAID manager software. The setup wizard warns you that an inplace upgrade is not the preferred way to set up Server 2016. Accept and proceed. The upgrade will take a while and requires some reboots.


Setup HyperV on Server 2016

After the upgrade process has finished, it’s time to set up HyperV again. First, configure NIC teaming again. Activating the HyperV role in Server 2016 is almost the same as in Server 2012: use the Server Manager to install new roles and features. Activating HyperV will require a reboot. After the reboot, configure a virtual switch so your VMs can access the network again.
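The virtual switch can also be recreated via PowerShell (a sketch; the adapter name "Ethernet" is an assumption, check Get-NetAdapter for the real name):

```powershell
# Create an external virtual switch bound to a physical adapter,
# keeping a virtual NIC for the management OS
New-VMSwitch -Name "ExternalSwitch" -NetAdapterName "Ethernet" -AllowManagementOS $true
```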

Import the Virtual Machines again

By default, HyperV on Server 2016 has no clue about the VMs from the former Server 2012 installation. The virtual machines have to be imported manually. This can be done using the HyperV console. First, provide the folder where the virtual machine configuration is placed. Afterwards, choose to directly register the VM.
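The in-place registration can also be scripted (a sketch; the path to the VM configuration XML is a placeholder):

```powershell
# Register the VM in place from its existing configuration file
Import-VM -Path "D:\VMs\MyVM\Virtual Machines\<GUID>.xml"
```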


Next, provide the folder where the virtual hard disk files are placed. Don’t get confused because you don’t see the actual VHDX files in the selection dialog. The import wizard will check if the virtual hard disk named in the configuration file can be found in this folder. If the wizard can’t find the virtual hard disk file, take a look at the list of VHDX file paths generated by the PowerShell script.


When all the VMs are imported, the upgrade is almost finished. Test the network connection of the virtual machines and make sure they can access the network via the newly created virtual switch. Removing the virtual network adapter and adding it again can help. Perform some tests, maybe reboot the server again, and install the latest updates.

Performance Optimization by using Included Columns and Field Selects

Since version 2012, Dynamics AX supports included columns in indices, although SQL Server has supported them for quite a long time. Here are some examples of how and why it is good practice to use included columns in an index. I’m using Dynamics AX 2012 R3 CU12 on Windows Server 2016 and SQL Server 2016 with Contoso demo data for this example.

Cluster Index

The cluster index can be defined using multiple fields and is used to define the order of records stored in the table. Even more important is the fact that if a table has a clustered index, all the data is stored in the index, i.e. the clustered index IS the table!


Take a look at the space allocated by the indices: about 219 MB are used to store actual data and 167 MB are used to store index information.


The following SQL statement reveals the size in detail:

SELECT ind.name, SUM(s.[used_page_count]) * 8 AS IndexSizeKB
FROM sys.indexes ind
INNER JOIN sys.tables t ON ind.object_id = t.object_id
INNER JOIN sys.dm_db_partition_stats AS s ON s.[object_id] = ind.[object_id]
    AND s.[index_id] = ind.[index_id]
WHERE t.name = 'INVENTTRANS'
GROUP BY ind.name
ORDER BY IndexSizeKB DESC

The table data is stored in the TransOriginIdx

name                 IndexSizeKB
I_177TRANSORIGINIDX  226992 (~221 MB)
I_177ITEMIDX          24872
I_177RECID            23416
I_177DIMIDIDX         22192

Index Usage with Field Select

Here is an example of a select statement with a field select on the InventTrans table:

while select ItemId, DatePhysical
    from inventTrans
    where inventTrans.ItemId == '0001'
       && inventTrans.DatePhysical >= str2Date('1.1.2011', 123)
{
    // ...
}

The trace parser reveals the actual SQL Statement sent to the database


What happens is what you would expect: SQL Server uses the ItemIdx for this query.


Only 5 logical reads were necessary.



Select Non-Index fields

When the query selects fields which are not part of the index, SQL Server has to perform a lookup in the clustered index for each record identified by the ItemIdx to get the other fields. For example, the Voucher and Qty fields are not part of the ItemIdx.


213 logical reads were necessary to fetch the data


This can get even worse when performing the lookup becomes too expensive. This can happen when the query returns a larger number of records, for example, when querying for another ItemId. In this case SQL Server does not use the ItemIdx anymore, but performs a search in the clustered index instead. The ItemIdx became completely useless for this query.


SQL server required 1345 logical reads to fetch the data!



Included Columns

Since version 2012, Dynamics AX supports the definition of included columns for indices. These columns are not used to sort the index; they are simply stored within the index to avoid costly lookups in the clustered index. In Dynamics AX you just add columns to the index and set the property IncludedColumn to Yes.
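On the SQL Server side, an index with included columns corresponds to a CREATE INDEX statement with an INCLUDE clause. A sketch for the example above (index and key column names are simplified; the actual ItemIdx definition may differ):

```sql
-- Non-key columns VOUCHER and QTY are stored in the index leaf level only,
-- so the query can be answered without touching the clustered index
CREATE NONCLUSTERED INDEX ItemIdx
ON dbo.INVENTTRANS (ITEMID, DATEPHYSICAL)
INCLUDE (VOUCHER, QTY)
```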


You can find the included columns in SQL server when viewing the properties of the index


When the statement from above is executed again, SQL Server can use the included columns from the index and does not need to perform costly lookups in the clustered index.


Only 6 logical reads are required to fetch the data. This is a huge optimization compared to the 1345 reads without included columns.