Towards Trust in Complex Cloud-based ERP Systems by Informing Users about the System Status

Modern cloud-based ERP systems are complex distributed software applications. These systems have become more powerful over the last decades and provide more features to satisfy user needs than previous generations of ERP systems. Furthermore, they are integrated with other cloud-based systems. [Question/problem] The resulting increase in complexity leads to a higher probability of failures within this integrated system. This makes it difficult for users to fully understand these systems, and even qualified key users no longer have an overview of possible system issues. As a result, the number of support calls and diffuse support ticket requests has increased in recent years. Moreover, ERP partner organizations such as insideAx experience that users lose trust in their systems. [Principal idea/results] The goal of our work is to foster the trust of ERP users in cloud-based ERP systems and to reduce the number of unnecessary support requests by enhancing existing user feedback and monitoring mechanisms and providing a visualization of system health indicators to users. These visualizations and explanations of the system health status need to be easy for users to understand. [Contribution] In this workshop paper, we provide insights from industry on how to foster user trust in complex software systems and depict a conceptual solution which uses system monitoring data to communicate the system status to users in a simple and understandable way. Our conceptual solution, for which we also provide a first implementation architecture proposal, foresees that simple modifications to the software together with ambient light devices allow such visualizations to be built.

The 5th International Workshop on Crowd-Based Requirements Engineering (CrowdRE’21) https://crowdre.github.io/ws-2021/

Use Power BI dataflow to decouple report design from ETL logic in an ERP upgrade project

A common requirement during an ERP upgrade project (e.g. from AX 2012 to D365 Finance) and its transition phase is to include both systems in the BI or reporting environment. Because of its tight integration with Dynamics, Power BI is in many cases the preferred reporting and BI platform. Power BI can combine different data sources such as OData feeds from D365 and SQL connections via a gateway. However, for the person developing reports, integrating cloud and on-premises data sources becomes complicated. For example, to create a sales report, one would need to include the Customers, SalesInvoiceHeader and SalesInvoiceLine entities as well as the CustTable, DirPartyTable, CustInvoiceJour and CustInvoiceTrans tables.

Different data sources in one PowerBI report

One way to address this issue is to separate the ETL logic from the report design. Power BI supports this approach with dataflows. Dataflows let you place Power Query logic directly in the Microsoft cloud and offer reusable data artifacts. People designing reports simply connect to the dataflow and are not concerned with the ETL logic required to combine data from the old AX installation and the new Dynamics 365 ERP cloud environment.

Use PowerBI dataflow to decouple ETL logic from report design

Example

From your Power BI workspace, create a new entity using a dataflow. Choose the OData feed for Dynamics 365 and provide the URL of the CustomersV3 entity.

OData feed for entities from Dynamics 365 Finance

Clicking next will open the Power Query editor and load the customers from Dynamics 365 Finance. Remove all the fields you don’t need in your application. In this example I’m using the DataAreaId, Account, Name, Group, Address and Delivery mode + terms.

PowerBI dataflow based on the Dynamics 365 Finance OData CustomersV3 entity

For an on-premises AX 2012 installation you need to install a data gateway so Power BI can access the local SQL database. If you already have a gateway, create a new dataflow in Power BI and use the SQL connection. I'd recommend creating a view on the database instead of loading the tables into Power BI.

CREATE VIEW [dbo].[PBIX_Customer] AS
SELECT
    CUSTTABLE.DATAAREAID,
    DIRPARTYTABLE.NAME,
    CUSTTABLE.ACCOUNTNUM,
    CUSTTABLE.CUSTGROUP,
    CUSTTABLE.TAXGROUP,
    LOGISTICSPOSTALADDRESS.ADDRESS,
    CUSTTABLE.DLVTERM,
    CUSTTABLE.DLVMODE
FROM CUSTTABLE
JOIN DIRPARTYTABLE
    ON CUSTTABLE.PARTY = DIRPARTYTABLE.RECID
JOIN DIRPARTYLOCATION
    ON DIRPARTYTABLE.RECID = DIRPARTYLOCATION.PARTY
JOIN LOGISTICSPOSTALADDRESS
    ON DIRPARTYLOCATION.LOCATION = LOGISTICSPOSTALADDRESS.LOCATION
WHERE
    -- only the address version that is valid right now
    LOGISTICSPOSTALADDRESS.VALIDFROM <= GETDATE()
    AND LOGISTICSPOSTALADDRESS.VALIDTO >= GETDATE()
GO

Choose SQL Server data source for PowerBI dataflow

Select the data gateway and provide a user to access the database

Connect a PowerBI dataflow to your on-premises AX 2012 database using a gateway

Select the view and load the AX 2012 data into Power BI. Save the dataflow.

Dynamics AX 2012 customer data via data gateway

After you have created both dataflows, return to your workspace, go to your dataflows, and refresh both to load the data.

Refresh dataflow from Dynamics 365 Finance and Dynamics AX 2012

Next, create a third dataflow to combine the data from the Dynamics 365 Finance and AX dataflow. This time choose to link entities from the other dataflows:

Link PowerBI entities via dataflow

Select both dataflows

Select PowerBI dataflows to merge

In the Power Query Online editor, rename the fields in both dataflow entities so you can append both queries. Be aware that Power Query is case sensitive: dataAreaId is not the same as DATAAREAID. When you have done this, append both queries as a new query.

Append queries in PowerBI

In the new query, make sure to remove duplicate customers.
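The rename, append, and deduplicate steps are the entire merge logic of the combined dataflow. As a rough illustration of what happens (in Python/pandas rather than Power Query M, with made-up sample data and hypothetical column names), the same steps look like this:

```python
import pandas as pd

# Hypothetical sample data: customers from D365 F&O (OData entity)
# and from the AX 2012 SQL view, with differently cased column names.
d365 = pd.DataFrame({
    "dataAreaId": ["usmf", "usmf"],
    "CustomerAccount": ["C-001", "C-002"],
    "Name": ["Contoso", "Adventure Works"],
})
ax2012 = pd.DataFrame({
    "DATAAREAID": ["usmf", "usmf"],
    "ACCOUNTNUM": ["C-002", "C-003"],
    "NAME": ["Adventure Works", "Fabrikam"],
})

# Step 1: rename columns so both tables match exactly.
# Like Power Query, pandas treats column names as case sensitive,
# so "dataAreaId" and "DATAAREAID" are two different columns.
ax2012 = ax2012.rename(columns={
    "DATAAREAID": "dataAreaId",
    "ACCOUNTNUM": "CustomerAccount",
    "NAME": "Name",
})

# Step 2: append both queries into one table.
combined = pd.concat([d365, ax2012], ignore_index=True)

# Step 3: remove customers that exist in both systems,
# keeping the first (here: the D365) record.
combined = combined.drop_duplicates(subset=["dataAreaId", "CustomerAccount"])
```

Here company plus account number is assumed to identify a customer; use whatever key fits your data when removing duplicates in Power Query Online.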

Remove duplicates in Power Query Online

If you have a Power BI Pro but not a Premium subscription, disable the load of the underlying queries.

Disable load when using PowerBI Pro

Save and refresh the dataflow. From the settings, schedule the refresh and endorse the dataflow as “Promoted” or “Certified”. This is not necessary, but it adds a label to the dataflow so your report designers can see that they can trust the data source. In Power BI Desktop open Get Data and choose Power BI dataflow as the data source:

Get data from PowerBI dataflow

Select the merged Customer data source.

Promoted and certified PowerBI dataflows

You can use the dataflows in your Power BI data model without having to worry about the logic behind them.

Linked dataflow sources in a PowerBI data model

Conclusion

Using dataflows has several advantages. It helps you decouple ETL logic from report design, which matters especially when working with older versions of Dynamics AX, where deeper knowledge of the data structure is required. Another advantage is reuse: typically you are not creating one single report, but several reports that require the same dimensions, e.g. customers. By using dataflows you don't need to maintain the load and merge logic in multiple Power BI files.

Green IT Consulting

A recent study revealed that living vegan can save up to 670 kg of CO2 per year. But veganism is not the only way to save CO2: using public transport such as the train instead of a car also helps the environment. Public transport companies like the Austrian Federal Railways (ÖBB) rely 100% on green electricity and explicitly state the CO2 reduction on their train tickets.

CO2 avoided compared to driving by car, as shown on a train ticket

Therefore I collected all my train tickets from the last year (2018) and summed up the CO2 reduction. Moreover, I looked up the distances in km and calculated what it would have cost the company I work for to reimburse the same trips by car. I compared this to a mean ticket price, because the actual price depends on how early you buy the ticket, e.g. between 9 € and 35 € for a ticket from Linz to Vienna.

Date       | From      | To        | CO2 Red. (kg) | Distance (km)
-----------|-----------|-----------|---------------|--------------
18.1.2018  | Linz      | Vienna    | 38,1          | 178
20.1.2018  | Vienna    | Linz      | 38,1          | 178
30.1.2018  | Linz      | Salzburg  | 26,0          | 130
31.1.2018  | Salzburg  | Linz      | 26,0          | 130
11.2.2018  | Linz      | Zurich    | 119,2         | 581
15.2.2018  | Zurich    | Linz      | 119,2         | 581
20.2.2018  | Linz      | Frankfurt | 114,8         | 541
24.2.2018  | Frankfurt | Linz      | 114,8         | 541
11.4.2018  | Linz      | Vienna    | 38,1          | 178
13.4.2018  | Vienna    | Linz      | 38,1          | 178
1.5.2018   | Linz      | Budapest  | 91,0          | 434
4.5.2018   | Budapest  | Linz      | 91,0          | 434
12.5.2018  | Linz      | Vienna    | 38,1          | 178
13.5.2018  | Vienna    | Linz      | 38,1          | 178
23.5.2018  | Linz      | Vienna    | 38,1          | 178
25.5.2018  | Vienna    | Linz      | 38,1          | 178
1.6.2018   | Linz      | Vienna    | 38,1          | 178
2.6.2018   | Vienna    | Linz      | 38,1          | 178
20.6.2018  | Linz      | Vienna    | 38,1          | 178
21.6.2018  | Vienna    | Linz      | 38,1          | 178
23.8.2018  | Linz      | Innsbruck | 64,5          | 313
24.8.2018  | Innsbruck | Linz      | 64,5          | 313
16.10.2018 | Linz      | Vienna    | 38,1          | 178
19.10.2018 | Vienna    | Linz      | 38,1          | 178
12.11.2018 | Vienna    | Linz      | 38,1          | 178
30.11.2018 | Linz      | Vienna    | 38,1          | 178
1.12.2018  | Vienna    | Linz      | 38,1          | 178
3.12.2018  | Linz      | Vienna    | 38,1          | 178
7.12.2018  | Vienna    | Linz      | 38,1          | 178
Sum        |           |           | 1554,9 kg     | 7380,0 km

Car cost at 0,42 € per km: 3099,60 €
Ticket cost at a mean price of 30 € (29 tickets): 870 €
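The totals can be cross-checked with a few lines of Python. The per-route distances and CO2 values are taken from the tickets above; the 0,42 €/km mileage rate and the 30 € mean ticket price are the assumptions stated earlier.

```python
# Each entry: (single-trip distance in km, CO2 saved in kg, number of single trips in 2018)
trips = [
    (178, 38.1, 19),   # Linz-Vienna
    (130, 26.0, 2),    # Linz-Salzburg
    (581, 119.2, 2),   # Linz-Zurich
    (541, 114.8, 2),   # Linz-Frankfurt
    (434, 91.0, 2),    # Linz-Budapest
    (313, 64.5, 2),    # Linz-Innsbruck
]

total_km = sum(km * n for km, _, n in trips)        # total distance travelled
total_co2 = sum(co2 * n for _, co2, n in trips)     # total CO2 saved (kg)
tickets = sum(n for _, _, n in trips)               # number of tickets bought

car_cost = total_km * 0.42      # what the same trips would cost as car mileage
ticket_cost = tickets * 30      # cost at the mean ticket price
savings = car_cost - ticket_cost
```

This reproduces the 7380 km and 1554,9 kg totals and puts the yearly saving at roughly 2230 €.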

The numbers show that just by using public transport for some business trips (not all) I was able to save a considerable amount of CO2: compared to the study, about twice the amount saved by living vegan 🙂 Moreover, taking the train instead of the car saved the company a lot of money: around 2230 € in one year. My conclusion is that it is not necessary to completely change your way of living to make a sustainable impact on CO2 reduction.

Feature-Based reuse in the ERP domain: An industrial case study

Enterprise Resource Planning (ERP) system vendors need to customize their products according to the domain-specific requirements of their customers. Systematic reuse of features and related ERP product customizations would improve software quality and save implementation time. In our previous research, we have developed a tool-based approach supporting feature-based reuse of ERP product customizations. Our tool environment automatically infers reusable features from requirements and their associated implementation artifacts. Furthermore, it allows domain experts to search for features based on natural language requirement descriptions representing the needs of new customers. Matching features can be automatically deployed to a new ERP product. In this paper, we present an industrial evaluation of our tool-based approach conducted together with an Austrian small- to medium-sized enterprise. A domain expert applied our approach to 20 randomly selected requirements for new customer products and identified matching features for 17 of them. We compared the time needed to identify and deploy the candidate features with the time required to implement the requirements from scratch. We found that, in total, over 60% of the implementation time can be saved by applying the reuse approach presented in this case study.

The paper on reusing ERP customizations across multiple Dynamics AX instances was presented at the 22nd International Systems and Software Product Line Conference 2018 in Gothenburg, Sweden. The paper is available in the ACM Digital Library.

Setup multiple developer VMs for Version Control with Team Services in Dynamics 365 for Finance and Operations

Here is a walkthrough showing how to connect two Dynamics 365 for Finance and Operations developer VMs to Visual Studio Team Services (VSTS).

Configure Azure AD and setup Visual Studio Team Services

Before you start, add your developers to Azure Active Directory. If you use Azure AD Connect, make sure the accounts have been synced to the cloud. In my case I added two additional users to my Azure AD.

Configure Developer Accounts in Azure AD

Next, log on to the Azure Portal using your organization account. Create a new service and search for “Team Services”. You should find Visual Studio Team Services (preview).

Create Visual Studio Team Services project in Azure Portal

Create a new instance of VSTS. The basic version of VSTS is free for 5 users. Make sure to use Team Foundation Version Control as the version control system. You may choose any methodology you like, but I'd recommend a CMMI-compatible one.

Create Visual Studio Team Services project in Azure Portal

After the deployment has succeeded, log on to your Visual Studio Team Services account using the URL https://<ACCOUNTNAME>.visualstudio.com. There you will see a new project. Open the project and add your developer users by clicking the (+) icon in the “Members” pane on the upper left. If everything is configured correctly, you should be able to add your AD users, in my example developer one and developer two.

Add developer accounts to Dynamics 365 FO project

Configure Developer VMs

If you are using the VHD images from Microsoft, the first thing you should do is rename the computer. If you don't rename the VMs, you will get errors when mapping the source control workspace on multiple machines. In my case I renamed the VMs to “devbox1” and “devbox2”. No domain is needed. Details here.

Rename Dynamics 365 FO developer VM

Configure first Developer Box

After the VM reboots, open Visual Studio 2015 in admin mode. Depending on your licensing you may need to provide additional credentials to use your subscription. Don't get confused if this is your Microsoft ID (aka Live ID), while you need your organization account to access VSTS. 😉 From the menu bar select Team > Manage Connections and provide the URL of your VSTS account.

Connect to Visual Studio Team Services

After you have connected to VSTS select the project to work on. Next, from the Team Explorer open the Source Control explorer. Map the root folder to a folder on your developer VM.

Map Source Control Folder in Visual Studio

Afterwards, use the Source Control Explorer to create two new folders: one for Visual Studio projects and one for metadata. This is where the D365 source code artifacts will be stored. Check in your pending changes; this will sync the folders to VSTS.

Map Dynamics 365 FO metadata folder

Now, in the Source Control Explorer, open the Workspace dropdown and edit your workspace. Map the metadata folder to C:\AOSService\PackagesLocalDirectory.

Map Dynamics 365 FO metadata folder

From the menu bar select Dynamics 365 > Model Management > Create a new model. Give the model a name and complete the wizard, which will ask you to create a new Dynamics X++ project.

Create new Dynamics 365 FO project

In the solution explorer, right click on the project and add to source control.

Check in to Source Control

Add a new element to the project, for example a new String Extended Data Type called DMOCustomerName. In the Solution Explorer, right click the project and select Build. After a few seconds you should see the console output “Build completed”. Check in all your pending changes.

Next, from the Team Explorer open the Source Control Explorer. You should see the model structure in the tree view. Right click the metadata folder and select “Add items to folder”. Navigate to your model folder and then to its Descriptor folder, and add the model descriptor XML file. Unfortunately, you have to do this manually; otherwise the second developer can sync the folders, extended data types, etc., but will not see the model in the AOT.

Add Dynamics 365 FO Model Descriptor File to Source Control

You can also inspect your code in VSTS

Dynamics 365 FO X++ Source Code

Configure second Developer Box

Make sure that the second VM has been properly renamed. Open Visual Studio in admin mode and connect to VSTS. Log on with the account of the second developer. Select the Dynamics 365 project and, again in the Source Control Explorer, map the metadata folder to C:\AOSService\PackagesLocalDirectory. Then get the latest version of the metadata folder.

Get Latest Version from metadata folder

This will create the model folder in the packages directory.

Model folder created in PackagesLocalDirectory

In Visual Studio open the Dynamics 365 Application Explorer. If the AOT is in classic mode, right click and switch to model view. Scroll down and you will see the synchronized model and its software artifacts.

Model in Dynamics 365 FO Application Explorer

Inferring variability from customized standard software products

PL4X Conceptual Solution

Systematic variability management is an important prerequisite for successful software reuse. However, it requires significant effort and extensive domain knowledge to document and maintain information on variability. In this paper we present a tool-supported approach for semi-automatically inferring variability information from customized standard software products. The approach not only enables the identification and documentation of variability information based on existing products, it is also capable of incrementally updating this information. To guarantee quick access to reusable artifacts (e.g. requirements, features or software components), the presented solution stores these artifacts together with related requirements and a generated variability model in an asset repository. The tool-supported approach has been applied to customizations of Microsoft Dynamics AX ERP systems. Our experiences highlight the potential and benefits of our approach compared to manually gathering information on software variability.

The paper was presented at the 18th International Software Product Line Conference 2014 in Florence. The paper is available in the ACM Digital Library.

Similarity Analysis within Product Line Scoping: An Evaluation of a Semi-Automatic Approach

Abstract: Introducing a product line approach in an organization requires a systematic scoping phase to decide what products and features should be included. Product line scoping is a non-trivial activity and traditionally consumes a lot of time and resources. This issue highlights the need to complement traditional scoping activities with semi-automatic approaches that allow an initial estimate of the reuse potential with little effort. In this paper we present an evaluation of a tool-supported approach that enables the semi-automatic analysis of existing products in order to calculate their similarity. This approach is tailored to the configuration-based systems domain, where we have used it to identify similarity within two types of industrial standard software products. The results of this evaluation highlight that our approach provides accurate results and leads to time savings compared to manual similarity analysis.

Thessaloniki, Greece

The paper was presented at CAiSE’14 (26th International Conference on Advanced Information Systems Engineering) in Thessaloniki, Greece. The paper is available from Springer.

2nd International Business and System Conference, Riga 2013

BSC is co-located with the 6th Working Conference on the Practice of Enterprise Modeling (PoEM) in Riga, Latvia.

Riga

A Lightweight Approach For Product Line Scoping

Noebauer M., Seyff N., Groher I., Dhungana D.

Many organizations providing products with common features wish to take advantage of that similarity in order to reduce development and maintenance efforts. Their goal is to move from a single-system development paradigm towards a product line approach. However, this transition is not trivial and requires a systematic scoping phase to decide how the product line should be defined, i.e. what products and features should be included and thus developed for reuse. Currently available product line scoping approaches require huge upfront investments in the scoping phase, consuming a lot of time and resources. Our experience has shown that small and medium enterprises require a lightweight approach to decide how the existing products are related to each other so that their potential for reuse can be estimated more easily. In this paper we present a conceptual solution and early tool support enabling companies to semi-automatically identify similarity within existing product configurations.

 

The paper “A Lightweight Approach for Product Line Scoping” was recently presented at the Euromicro Conference on Software Engineering and Advanced Applications in Cesme, Turkey. The paper will soon be available in IEEE Xplore.

Radisson Blu Conference Hotel

The Beach at Radisson Blu Conference Hotel in Cesme

VaMoS 2012 Paper: Managing variability of ERP ecosystems

The paper is about variability in ERP ecosystems, especially the situation of Dynamics AX partner companies. We present an idea of how partner companies could manage the variability in their solutions to set up a product line. The paper can be found in the ACM Digital Library.

Battle of Nations Monument

Battle of Nations Monument near Leipzig