Power Automate: Deploy and Execute an Ethereum Smart Contract

Power Automate (formerly Microsoft Flow) is a great cloud-based tool to automate all kinds of tasks. There is an Ethereum connector (Beta) that can be used to deploy a Smart Contract to an Ethereum Blockchain network and execute its functions.

Ethereum Network

You need to connect to an Ethereum network. There is a fully managed Blockchain Service in Azure. I’m running a private network with Proof-of-Authority consensus. In the Azure management portal, go to the transaction node to get the information required to connect.

Blockchain-as-a-Service in Azure

Smart Contract

I’m using Visual Studio Code with the Ethereum Blockchain Development SDK to implement a Smart Contract in Solidity. You can find the link to the SDK at the Azure Blockchain Service portal.

VS Code with Azure Blockchain SDK

The SDK requires a number of additional software products to be downloaded and installed. I found that the installed Solidity compiler was newer than expected; as a result, the demo smart contract you get from the SDK did not compile. The simplest solution was to widen the version pragma of the contract:

pragma solidity >= 0.5.16 <= 0.7.0;

ABI and Bytecode

In order to automate the deployment of a Smart Contract via Power Automate, you need to provide the ABI and Bytecode. Both can be copied in VS Code from the context menu of the compiled contract in the build directory.

Copy ABI and Bytecode directly from VS Code

Power Automate

You can provide the ABI and Bytecode directly in the Deploy Smart Contract action. However, I decided to place both in Azure Table Storage and fetch them from there. To do so, I created a table with a column for the Bytecode, a column for the ABI, and a name column.

Store ABI and Bytecode in Azure Storage Account

The first step in my flow is to connect to the Azure Storage account and get the smart contract I need. The result from the storage is a JSON string, which is parsed so that the ABI and Bytecode are available in the next steps.

Fetch the smart contract binaries from the storage account
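
As a rough sketch, the parsed entity could look like the following; the column names Name, ABI, and Bytecode come from my table design, the contract name is hypothetical, and the ABI and Bytecode values are shortened here for readability:

{
    "PartitionKey": "SmartContracts",
    "RowKey": "1",
    "Name": "MachineContract",
    "ABI": "[{\"inputs\":[],\"stateMutability\":\"nonpayable\",\"type\":\"constructor\"}]",
    "Bytecode": "0x6080604052348015610011576000"
}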

Next, the Deploy Smart Contract action is used to deploy the contract. There you need to provide the connection to your Ethereum network. In my example there are two parameters for the constructor; for testing purposes these values are hardcoded. In real life you would provide the values from the calling source. The third parameter of the connector action requires the Bytecode, which is taken from the storage account. The result of the deployment is the smart contract's address, which is stored in a flow variable.

Deploy a Smart Contract to a private Ethereum Blockchain Network
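
As an illustration only, the two hardcoded constructor values could be provided as a JSON string like the following; the parameter names here are hypothetical and must match the constructor arguments of your contract:

{
    "machineId": 1,
    "productId": 100
}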

Interacting with the smart contract

After the Smart Contract has been deployed to the Ethereum Blockchain network, use the Execute Smart Contract Function action in the flow. For each call you have to provide the address, the ABI, the name of the function, and the parameters as a JSON string. A function without parameters has to be called with {} because the parameters property is mandatory.

Execute a Smart Contract function via Power Automate / Flow

Here is an example of a function with several parameters. These parameters have to be provided as a JSON string in Flow.

function SetupMachine(int sawLength, 
                      int waterTemp, 
                      int rpm,
                      int speed) public
    {
        if (State != StateType.Assigned)
        {
            revert('Assign to a machine first');
        }

        SawLengthMM = sawLength;
        WaterTempDgrC = waterTemp;
        ExtruderRPM = rpm;
        ExtruderSpeed = speed;

        State = StateType.Setup;
        Worker = msg.sender;
    }
Execute a Smart Contract function with parameters
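
For the SetupMachine function shown above, the parameters could be provided as a JSON string like this (sample values):

{
    "sawLength": 1000,
    "waterTemp": 80,
    "rpm": 1200,
    "speed": 15
}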

Use UI Flows in Power Automate to interact with a web site

UI Flows are a new feature from the Power Platform April 2020 wave and allow you to integrate locally installed applications and web sites. I’ve made a video about UI Flows for web sites. In this demo I’ve created a flow that reads the Bitcoin / Euro rate from a web site and sends it via email.

Call an Azure Function from X++ in Dynamics 365 Finance / SCM

Create an Azure Function

Azure Functions are a simple way to package and provide business logic as a web service without worrying about hosting a web server. Azure Functions can be implemented in different programming languages like C#, JavaScript, PHP, and Java, and can be hosted on Linux and Windows with the runtime environment that fits your needs.

In the Azure Portal click + Create a resource and search for Function App:

Create an Azure Function App

On the next screen, choose a subscription and create a resource group (or use an existing one if you like). Provide a useful name and choose Code as the publish method. Select .NET Core 3.1 as the runtime stack and a region near your location:

Configure the Azure Function App to use .NET Core 3.1

Click Review + Create to create the Azure Function App. It takes a few minutes to provision all the required elements:

Deploy the Azure Function App

Click on Go to Resource. Next to the Functions group, click + to create a new function and select In-Portal to edit the function code directly in the browser:

Create a new HTTP trigger

Choose Webhook + API to create a demo function that can be called via HTTP POST.

Use webhook for the Azure Function

This will create a function that takes a name as a parameter and returns “Hello ” followed by the provided name.

C# Azure Function code
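
The generated code resembles the standard C# script HTTP trigger template; here is a slightly shortened sketch (namespaces like System.IO and Microsoft.Extensions.Logging are imported automatically by the Functions host):

#r "Newtonsoft.Json"

using System.Net;
using Microsoft.AspNetCore.Mvc;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Take the name from the query string or from the JSON request body
    string name = req.Query["name"];
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    // Return "Hello <name>" to the caller
    return new OkObjectResult($"Hello {name}");
}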

You can test the function by using the Test tab on the right. The function takes a JSON string with a name parameter and returns a simple string.

Test the Azure Function with a JSON string
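
For example, posting the request body {"name":"Azure"} returns the string Hello Azure.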

Call the function from X++

In the Azure portal, get the function URL with a function key. Copy the URL including the key:

Copy the Azure Function URL with function key

In Visual Studio, create an X++ class with a main method for testing. Use the System.Net.Http.HttpClient class to call the service. The content is a UTF-8 encoded JSON string with a name parameter and value. In this example the name is Dynamics:

// Create the HTTP client and the UTF-8 encoded JSON payload
System.Net.Http.HttpClient httpClient = new System.Net.Http.HttpClient();
System.Net.Http.HttpContent content = new System.Net.Http.StringContent(
        "{\"name\":\"Dynamics\"}",
        System.Text.Encoding::UTF8,
        "application/json");

At the moment X++ does not support the await keyword for asynchronous calls. The workaround is to use the Task.Wait() method. Call the service asynchronously with your function URL and get the content of the response:

// Post the payload to the Azure Function and wait for the async call to complete
var task = httpClient.PostAsync("https://<YOUR_FUNCTION_URL>", content);
task.Wait();
System.Net.Http.HttpResponseMessage msg = task.Result;

// Read the response body as a string
System.Net.Http.HttpContent ct = msg.Content;
var result = ct.ReadAsStringAsync();
result.Wait();
System.String s = result.Result;

// Show the result in the Infolog
info(s);

Start the class from Visual Studio. The result should look like this:

Call the Azure Function from Dynamics 365 Finance

Connect to the SQL database of a Dynamics 365 Finance Test instance

In Dynamics 365 Finance / SCM we can no longer access the SQL database of the production environment directly. However, we can access the SQL database of the Acceptance Test instance. All required information can be found in LCS. I’ve made a video showing where to find this information in LCS and how to connect to the SQL database.

Using SQL DDL Triggers to restore read Permissions programmatically after Dynamics AX 2009 Synchronization

Recently, a customer using Dynamics AX 2009 implemented a web service that accesses a view directly in SQL Server. Therefore, they created a new SQL user login and gave the user read permissions on the view.

Read permission on a view

However, when synchronizing the data dictionary in Dynamics AX 2009, the views are dropped and recreated, and the permissions on the object are lost. Therefore, the web service call fails.

One way to address this issue from the SQL side is to create a DDL trigger that sets the permissions on the view programmatically. Here is a small SQL script that grants read permissions to the user view_user on the DIRPARTYVIEW after the view has been recreated:

CREATE TRIGGER [VIEW_PERMISSION]
ON DATABASE
    FOR CREATE_VIEW
AS
BEGIN
    -- Get the name of the view that was just created
    DECLARE @name SYSNAME
    SELECT  @name = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]','SYSNAME')

    -- Re-grant read permissions when Dynamics AX recreates the view
    IF @name = 'DIRPARTYVIEW' BEGIN
        GRANT SELECT ON [dbo].[DIRPARTYVIEW] TO [view_user]
    END
END
GO

ENABLE TRIGGER [VIEW_PERMISSION] ON DATABASE
GO
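
The trigger remains active for every future CREATE VIEW statement on the database. If it is no longer needed, it can be disabled or dropped again:

DISABLE TRIGGER [VIEW_PERMISSION] ON DATABASE
GO

DROP TRIGGER [VIEW_PERMISSION] ON DATABASE
GO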

Dynamics 365 FO: Export Entity Store to Azure Data Lake

Since version 10, Dynamics 365 for Finance and Operations supports exporting the entity store to Azure Data Lake. The main benefits are reduced costs, because Azure cloud storage is cheap, and easy access for Business Intelligence tools like PowerBI.

If you are running a local development VM, the data connections tab in the system parameters is deactivated by default. However, it can be activated using the SysFlighting table.
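
As a sketch, a flight can be enabled by inserting a row into the SysFlighting table in the AxDB database and restarting the IIS and batch services; the flight name below is a placeholder, so check the Microsoft documentation for the exact name of the data connections feature flight:

-- The flight name is a placeholder; use the documented name for your version
INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID)
VALUES ('DataConnectionsFeatureFlight', 1, 12719367)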

The configuration is pretty well documented by Microsoft. I’ve performed all the necessary steps and recorded a video:

Connect Azure Data Lake Storage with PowerBI dataflow

PowerBI dataflow performs ETL (Extract, Transform, Load) workloads in the cloud. PowerBI Pro and Premium users get dataflow storage without additional charges. However, this storage is managed by PowerBI and you cannot access it directly. Therefore, BYOSA (Bring Your Own Storage Account) is supported to connect your own Azure storage account with PowerBI dataflow. I’ve made a video, following the documentation, on how to connect an Azure storage account with PowerBI. Please find my video on YouTube:

Configure Azure Data Lake storage with PowerBI dataflow