The first lock-down in spring was a game changer. Working from home has become normal, and even outside lock-down periods Microsoft Teams usage remains high. For us it has become the collaboration backbone.
Teams usage is stable whether in lock-down or not
Webcast Marathon
Last week, on 26th November, we held a webcast marathon: six talks on Dynamics 365 Finance and Power Platform. The webcasts were organized using Microsoft Teams Live Events. The registration pages were quickly built using Microsoft Forms and automated using Power Automate. When a new participant registered for a webcast, Power Automate read the registration from Forms, picked the corresponding .ics calendar file, sent a mail to the participant with the .ics file as attachment and created a new record in Azure Table Storage.
Webcast Registration via Power Automate
It turned out that Office 365, Power Platform and Teams work great together and allow you to manage a complex, distributed scenario like a webcast series with different speakers exclusively in the cloud.
Lock-down results
In contrast to the very strict first lock-down in spring, the current “hard” lock-down allows more exceptions.
Monday morning: Some stay at home but many (have to) go to work (30th November)
Monday morning: First week of the first lock-down in spring (16th March)
In theory we have an all-day curfew. Schools and universities are closed, many retail stores are closed, home improvement stores are closed for B2C customers and gastronomy is limited to delivery. But since going to work, sport and shopping are allowed, we don’t really feel the pressure like in spring. However, it seems to work and the numbers are dropping.
Taking backups is crucial. I prefer to use the Azure cloud for storing backups: in case a disaster strikes on-premises, the data is at least safe in the cloud. Microsoft offers a great solution with Azure Backup. For simple file-based backups you only need the recovery agent installed on the source server. For more complex backups, e.g. of SQL Server or Hyper-V, the Azure Backup Server is required.
Azure Backup Server (aka. DPM)
The Azure Backup Server is a re-branded System Center Data Protection Manager. Backups can be stored locally on disk and in an Azure Backup Vault. Like the SCDPM Server, the Azure Backup Server requires agents to be installed on the source systems. This can be done using push or pull techniques. Within a domain you can instruct the DPM Server to install an agent on a server. You may also install the agent by hand and instruct DPM to connect to an already installed agent.
Azure Backup Server Console
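If you install the agent by hand for the pull approach, a rough sketch looks like this (the backup server name is a placeholder and the installation path depends on the agent version): after running the agent setup on the source server, point the agent to the backup server using SetDpmServer.exe.

cd "C:\Program Files\Microsoft Data Protection Manager\DPM\bin"
.\SetDpmServer.exe -dpmServerName MYBACKUPSERVER

Afterwards, the server can be attached from the backup server’s management console.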
I had one legacy server hosting a SQL Server 2012 instance, which was protected with System Center Data Protection Manager 2012 a while ago. The old agents were uninstalled years ago but left some entries that blocked the installation of the new DPM agent.
Identifying the problem
When the installation fails, a log file is created in C:\Windows\Temp. A look into the log revealed that the installer found an installed product that should not be there.
Agent installation started
The agent bootstrapper is doing prerequisite checks
Querying for Product with Upgrade code: {0BEE7F6A-CE2A-A5CF-FFEB-8E0F8A8CDE75}
Querying for Product with Upgrade code: {EFF053DE-592F-5574-9AA3-64662A944952}
IsProductInstalled: MsiEnumRelatedProducts returned ERROR_SUCCESS and product code found is {EECBB752-2C6E-45B7-9F18-2327B886309A}
IsProductInstalled: Product: {EECBB752-2C6E-45B7-9F18-2327B886309A} is installed
PerformAgentInstall failed with errorcode=addfd060
Install ProtectionAgent failed with errorcode=addfd060
Failed: Hr: = [0x80990a2d] DPMAgentInstaller failed, error says: [(null)]
Failed: Hr: = [0x80990a2d] : SC-DPMRA found. Cannot install Microsoft Azure Backup Agent
Failed: Hr: = [0x80990a2d] : Encountered Failure: : lVal : PerformAgentInstall(installargs, silent, skipKB)
Failed: Hr: = [0x80990a2d] : Encountered Failure: : lVal : InstallProtectionAgent(false , false )
To identify the problem, the PowerShell cmdlet Get-WmiObject can be used to display the ID and name of installed products. An old version of the System Center Data Protection Manager agent had not been removed properly.
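For example, a quick sketch in PowerShell (note that enumerating Win32_Product is slow, because Windows Installer validates every package while listing them):

Get-WmiObject -Class Win32_Product |
    Where-Object { $_.Name -like "*Data Protection Manager*" } |
    Select-Object IdentifyingNumber, Name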
A first attempt to get rid of the DPM 2012 leftovers was to clean the registry: regedit was opened and all entries referencing {EFF053DE-592F-5574-9AA3-64662A944952} were deleted. However, this was not sufficient to install the new agent.
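To locate such leftover entries in the first place, a search like the following can help (a sketch; scanning the whole SOFTWARE hive takes a while):

reg query HKLM\SOFTWARE /s /f "{EFF053DE-592F-5574-9AA3-64662A944952}"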
Microsoft provides a tool to remove entries of uninstalled programs. The tool MicrosoftProgram_Install_and_Uninstall.meta.diagcab can be downloaded here: Fix problems that block programs from being installed or removed. It found the entry for DPM 2012 and removed it.
Fixit for blocking installation / uninstallation
The tool was a great step in the right direction; however, the installation failed again because the DPM service could not be installed. The log file showed the following entry:
Received type [0x01000000] message [Service 'DPM CPWrapper Service' (DpmCPWrapperService) could not be installed. Verify that you have sufficient privileges to install system services.]
It turned out that a CPWrapper service already existed but was no longer functional: the path to its binary was broken, so the properties dialog in the services MMC did not work either. But there are tools to remove corrupt service entries; Process Hacker can be used to simply delete the service entry.
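Alternatively, an orphaned service entry can usually be removed with the built-in sc.exe, assuming the service name shown in the log entry above:

sc.exe delete DpmCPWrapperService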
Dynamics 365 Finance / SCM Tier 1 sandbox environments are heavily used by partners for developing and building Dynamics 365 Finance / SCM applications. Microsoft-hosted Tier 1 environments were a great deal: well-sized VMs with 28 GB RAM and 4 cores, with SQL Server, Visual Studio and Dynamics 365 Finance pre-installed, for a small fixed price per month, available 24/7. However, Microsoft recently announced that it will no longer include a Microsoft-hosted Tier 1 sandbox environment with the Dynamics 365 Finance / SCM license, and that additional Tier 1 sandbox add-ons can no longer be purchased. The preferred solution is to use Cloud-Hosted environments instead.
No more Microsoft-Hosted Tier1 environments
Microsoft-Hosted vs. Cloud-Hosted
From a technical standpoint there is no difference between a Microsoft-hosted and a Cloud-Hosted environment. Both solutions deploy a Windows Server VM in Azure, and in both cases the deployment is managed via Lifecycle Services (LCS).
LCS management of a Cloud-Hosted environment
Artefacts of a Cloud-Hosted Dynamics 365 FO Tier 1 environment in Azure
However, there are 3 major aspects to consider:
One big difference is the pricing model. Microsoft-hosted environments (or add-ons) come at a fixed (!) price per month, while Cloud-Hosted environments are deployed on an Azure subscription and are therefore billed like classic IaaS (aka. a virtual machine in Azure). Make sure to calculate the costs (!) and turn off the environments when they are not needed; see the sketch after these points.
Another difference is the ability to choose the sizing of the deployed environment. In contrast to Microsoft-hosted Tier 1 environments, you are now free to choose a sizing that fits your needs, e.g. more (or less) RAM, CPU cores, Premium SSD storage, etc.
Moreover, in contrast to Microsoft-hosted environments, we now get an admin account on our machines. It was understandable that Microsoft locked down the cheap VMs to prevent misuse for anything other than Dynamics 365. But since we own and pay for the VM in a Cloud-Hosted environment, it is more than fair to have admin access on the machine.
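Regarding the costs: a deallocated VM only incurs storage charges, so stop the environment from LCS when it is not needed, or deallocate the VM directly. A quick sketch with the Azure CLI (resource group and VM name are placeholders):

az vm deallocate --resource-group d365-dev-rg --name d365-dev-vm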
Video: How to deploy a Cloud-Hosted Dynamics 365 Tier1 developer VM
Make sure to visit my YouTube channel and watch how to deploy a Cloud-Hosted Dynamics 365 FO developer VM using an Azure subscription.
Update: Management Certificates
The use of management certificates is not supported with a CSP Azure subscription. Use a user connection instead.
Update: Provisioning Admin User
Please note that the user deploying the environment is provisioned as its administrator. Microsoft-hosted environments had to be signed off by a user from the customer tenant; I’d recommend sticking to this process when deploying a Cloud-Hosted environment.
At work we recently discussed ways to start up Dynamics AX 2012 and navigate to a specific record. The requirement was to open Dynamics AX from a DMS client that manages invoices and other documents.
There are different approaches to achieve this goal. One way is to use the Startup Command framework, which instructs Dynamics AX to execute certain functionality during startup, e.g. compile, synchronize or open a menu item. In order to start a menu item, you provide an XML file that contains the menu item name and point to this file from the .axc Dynamics AX configuration file.
Startup Dynamics AX 2012 with an XML configuration file
Reference the record in the startup XML file
For many forms in Dynamics AX it is sufficient to call the corresponding menu item with an Args object that holds the record. To specify a record in Dynamics AX you need to provide at least the TableId and the RecId. For example, the customer “Adventure Works” can be identified by TableId 77 (CustTable) and RecId 22565422070. Add two additional attributes, RecId and TableId, to the XML file that is used to open the CustTable form.
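A minimal sketch of such a startup file, assuming the usual SysAutoRun schema (verify the element and attribute names against the execRun() method in your environment):

<?xml version="1.0" encoding="utf-8"?>
<AxaptaAutoRun version="4.0" logFile="C:\Temp\AutoRun.log" exitWhenDone="false">
    <Run type="menuItem" name="CustTable" menuItemType="display" RecId="22565422070" TableId="77" />
</AxaptaAutoRun>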
In the SysAutoRun class, open the execRun() method and declare the following variables at the top:
RecId recId;
TableId tableId;
Args arg = new Args();
Common common;
DictTable dictTable;
At the bottom, find the place where a menu item is started. Before the if(mf) statement add the following code to read the RecId and TableId from the XML file and select the corresponding record:
// Read the RecId and TableId attributes from the startup XML command
recId = str2int64(this.getAttributeValue(_command, 'RecId'));
tableId = str2int(this.getAttributeValue(_command, 'TableId'));

if (recId != 0)
{
    // Create a record buffer for the given table and select the record
    dictTable = new DictTable(tableId);
    common = dictTable.makeRecord();
    select common where common.RecId == recId;

    // Hand the record over via the Args object
    arg.record(common);
}
Within the if (mf) block, pass the Args object when the menu function is run, so the record is handed over:
mf = new MenuFunction(name, menuItemType);
if (mf)
{
    this.logInfo(strfmt("@SYS101206", mf.object(), enum2str(mf.objectType())));

    // Run the menu function with the Args object holding the record
    mf.run(arg);
    result = true;
}
Test your configuration
Now you can test your configuration. Create a new .axc file and point it to the XML file. Make sure the XML file has valid TableId and RecId attributes. Start Dynamics AX using the .axc file; the defined menu item should open and display the record.
Power Automate (aka. Microsoft Flow) is a great cloud-based tool for automating all kinds of tasks. There is an Ethereum connector (beta) that can be used to deploy a smart contract to an Ethereum blockchain network and execute its functions.
Ethereum Network
First, you need to connect to an Ethereum network. Azure offers a fully managed Blockchain Service; I’m running my private network with Proof-of-Authority consensus. In the Azure portal, go to the transaction node to get the information required to connect.
Blockchain-as-a-Service in Azure
Smart Contract
I’m using Visual Studio Code with the Ethereum Blockchain Development SDK to implement a Smart Contract in Solidity. You can find the link to the SDK at the Azure Blockchain Service portal.
VS Code with Azure Blockchain SDK
The SDK downloads and installs a lot of additional software. I found that the installed Solidity compiler was newer than expected; as a result, the demo smart contract that ships with the SDK did not compile. The simplest solution was to change the pragma of the contract:
pragma solidity >= 0.5.16 <= 0.7.0;
ABI and Bytecode
In order to automate the deployment of a smart contract via Power Automate you need to provide the ABI and the bytecode. Both can be found in VS Code, via the context menu in the build directory.
Copy ABI and Bytecode directly from VS Code
Power Automate
You can provide the ABI and bytecode directly in the Deploy Smart Contract action. However, I decided to place both in Azure Table Storage and fetch them from there. To do so, I created a table with a column for the bytecode, a column for the ABI and a name column.
The first step in my flow is to connect to the Azure Storage account and get the smart contract I need. The result from the storage is a JSON string, which is parsed so that the ABI and bytecode are available in the next steps.
Fetch the smart contract binaries from the storage account
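The parsed entity returned from Table Storage then looks roughly like this (a sketch with hypothetical, shortened values):

{
    "PartitionKey": "SmartContracts",
    "RowKey": "MachineContract",
    "ABI": "[ { \"type\": \"constructor\", \"inputs\": [ ... ] } ]",
    "Bytecode": "0x608060405234801561001057600080fd5b50..."
}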
Next, the Deploy Smart Contract action is used to deploy the contract. There you need to provide the connection to your Ethereum network. In my example there are two constructor parameters, and for testing purposes their values are hardcoded; in real life you would take the values from the calling source. The third parameter of the connector action requires the bytecode, which is taken from the storage account. The result of the deployment is the smart contract’s address, which is stored in a flow variable.
Deploy a smart contract
Interacting with the smart contract
After the smart contract has been deployed to the Ethereum network, use the Execute Smart Contract Function action in the flow. For each call you have to provide the contract address, the ABI, the name of the function and the parameters as a JSON string. A function without parameters has to be called with {}, because the parameters property is mandatory.
Execute a Smart Contract function via Power Automate / Flow
Here is an example of a function with some parameters, which have to be provided as a JSON string in Flow.
function SetupMachine(int sawLength,
                      int waterTemp,
                      int rpm,
                      int speed) public
{
    if (State != StateType.Assigned)
    {
        revert('Assign to a machine first');
    }
    SawLengthMM = sawLength;
    WaterTempDgrC = waterTemp;
    ExtruderRPM = rpm;
    ExtruderSpeed = speed;
    State = StateType.Setup;
    Worker = msg.sender;
}
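The matching parameter JSON for SetupMachine in the flow would look like this (hypothetical values):

{
    "sawLength": 1200,
    "waterTemp": 80,
    "rpm": 350,
    "speed": 25
}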
UI Flows are a new feature of the Power Platform 2020 release wave 1 and allow you to integrate locally installed applications and websites. I’ve made a video about UI Flows for websites: in this demo I create a flow that reads the Bitcoin / Euro rate from a website and sends it via email.
Azure Functions are a simple way to package and provide business logic as a web service without worrying about hosting a web server. Azure Functions can be implemented in different programming languages like C#, JavaScript, PHP, Java, etc., and can be hosted on Linux and Windows with different runtime environments to fit your needs.
In the Azure Portal click + Create a resource and search for Function App:
In the next screen choose a subscription and create a resource group (or use an existing one). Provide a useful name and choose Code as the publish method. Select .NET Core 3.1 as the runtime stack and a region near your location:
Click Review + Create to create the Azure Function. It takes a few minutes to provision all the required resources:
Click on Go to Resource. Next to the Functions group, click + to create a new function and select In-Portal to edit the function code directly in the browser:
Choose Webhook + API to create a demo function that can be called via HTTP POST.
This will create a function that takes a name as parameter and returns “Hello ” + the parameter name.
You can test the function by using the Test tab on the right. The function takes a JSON string with a name parameter and returns a simple string.
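For example, posting this request body in the Test tab (the exact response text may vary with the template version):

{ "name": "Dynamics" }

returns the string Hello Dynamics.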
Call the function from X++
In the Azure portal, get the function URL including a function key and copy it:
In Visual Studio create an X++ class with a main method for testing. Use the System.Net.Http.HttpClient class to call the service. The content is a JSON string encoded in UTF-8 with a name parameter and value. In this example the name is Dynamics:
// Create the HTTP client and the UTF-8 encoded JSON payload
System.Net.Http.HttpClient httpClient = new System.Net.Http.HttpClient();
System.Net.Http.HttpContent content = new System.Net.Http.StringContent(
    "{\"name\":\"Dynamics\"}",
    System.Text.Encoding::UTF8,
    "application/json");
At the moment, X++ does not support the await keyword for asynchronous calls. The workaround is to use the Task.Wait() method. Call the service asynchronously with your function URL and read the content of the response:
// Post the request and block until the asynchronous call completes
var task = httpClient.PostAsync("https://<YOUR_FUNCTION_URL>", content);
task.Wait();

// Read the response content, again waiting for the async read to finish
System.Net.Http.HttpResponseMessage msg = task.Result;
System.Net.Http.HttpContent ct = msg.Content;
var result = ct.ReadAsStringAsync();
result.Wait();

System.String s = result.Result;
info(s);
Start the class from Visual Studio. The result should look like this:
In Dynamics 365 Finance / SCM we can no longer access the SQL database of the production environment directly. However, we can access the SQL database of the acceptance test instance. All required information can be found in LCS. I’ve made a video showing where to find this information in LCS and how to connect to the SQL database.
Recently, a customer using Dynamics AX 2009 implemented a web service that accesses a view directly in SQL Server. For this purpose they created a new SQL login and gave the user read permission on the view.
Read permission on a view
However, when the data dictionary is synchronized in Dynamics AX 2009, the views are dropped and recreated, and the permission on the object is lost. Therefore the web service call fails.
One way to address this issue from the SQL side is to create a DDL trigger that sets the permissions on the view programmatically. Here is a small SQL script that grants read permission to the user view_user on DIRPARTYVIEW after the view has been recreated.
CREATE TRIGGER [VIEW_PERMISSION]
ON DATABASE
FOR CREATE_VIEW
AS
BEGIN
    -- Get the name of the view that was just created
    DECLARE @name SYSNAME
    SELECT @name = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]', 'SYSNAME')

    -- Re-grant read permission whenever DIRPARTYVIEW is recreated
    IF @name = 'DIRPARTYVIEW'
    BEGIN
        GRANT SELECT ON [dbo].[DIRPARTYVIEW] TO [view_user]
    END
END
GO

ENABLE TRIGGER [VIEW_PERMISSION] ON DATABASE
GO
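After the next synchronization has recreated the view, the grant can be verified by impersonating the user (a quick test sketch):

EXECUTE AS USER = 'view_user';
SELECT TOP 1 * FROM [dbo].[DIRPARTYVIEW];
REVERT;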