I am seeing more and more customers and partners actively promote the use of Azure API Management for D365 F&O backend and/or D365 Commerce Scale Unit OData interfacing. See, for example, this blog post by Adrià Ariste Santacreu. However, a significant number of customers and partners still shrug their shoulders and think: “Why would we use it when we can interface ‘directly’?” In this blog post I’ll address that question by sharing my personal best practices for using Azure API Management for D365 F&O backend and/or D365 Commerce Scale Unit OData interfacing. But before sharing the best practices, we first have to dispel the myth that interfacing via Azure API Management (APIM) would not be direct. 😉
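To illustrate the point, here is a minimal sketch in Python of the same OData GET sent “directly” versus through APIM. The hostnames, entity name and key values are hypothetical placeholders; APIM’s standard `Ocp-Apim-Subscription-Key` header is real, though your gateway policy may use a different scheme. Note that the only differences are the base URL and one extra header; either way it is a single, direct HTTPS round trip.

```python
# Sketch: a D365 F&O OData request sent directly vs. via Azure API Management.
# Hostnames, entity path and key values below are hypothetical placeholders.

def build_odata_request(base_url: str, entity: str, bearer_token: str,
                        apim_subscription_key: str = "") -> dict:
    """Return the URL and headers for a GET on a D365 OData entity."""
    headers = {
        "Authorization": f"Bearer {bearer_token}",
        "Accept": "application/json",
    }
    if apim_subscription_key:
        # Standard header APIM uses to identify the calling subscription.
        headers["Ocp-Apim-Subscription-Key"] = apim_subscription_key
    return {"url": f"{base_url}/data/{entity}", "headers": headers}

# 'Direct' call against the environment URL...
direct = build_odata_request(
    "https://contoso-prod.operations.dynamics.com", "CustomersV3", "<token>")

# ...and the same call routed through an APIM gateway.
via_apim = build_odata_request(
    "https://contoso-apim.azure-api.net/d365", "CustomersV3", "<token>",
    apim_subscription_key="<apim-key>")
```

Everything else about the request (the OData entity path, the Bearer token, the payload) stays identical, which is the point: APIM adds policy, throttling and monitoring, not indirection.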
In most projects, customers and partners deploy Build and DEV boxes as LCS cloud-hosted environments. These utilise multiple Azure resources such as storage accounts, virtual networks and virtual machines. In many cases, the LCS project works against a single Azure subscription, which eventually hosts the PROD environment as well. If this is the case, you can potentially save your customer quite some money. In this blog post, we’ll see how much and how!
Ever since D365 Finance & Operations V10, we’ve been in the Managed Updates model, which aligns all customers on a similar platform version – the so-called “One Version” strategy, which brings a lot of benefits. But is there really ONE VERSION in reality? We have License keys, Parameter modules, SysFlighting and Feature Management, which already contains 250+ Features in V10.0.11 PEAP. Did you all enable them? If you add up all these parameters, D365 FinOps installations can fundamentally differ even if the underlying platform version is similar.
Now with D365 Retail (Commerce) there is yet another variable: a relatively new Retail/Commerce Parameter area which controls Retail functionality behind the scenes… In this blog post we’ll dig out all the Features that can be enabled from here. Fellow functional and technical D365 Retail/Commerce consultants, sit tight: this area is likely to become more and more important in the future. It can also be really helpful for your own customisations.
After walking through the overall D365 Retail API Architecture in parts I and II of this series on the D365 Retail APIs, it’s now time to enable anyone to actually use them. To that end, this blog post contains full details on how to construct a request to most of the 400 out-of-box D365 Retail APIs. Beyond that, I’ve included step-by-step instructions on how to use any of the Retail APIs from Microsoft Power Automate (Flow).
But before we can really get things rolling, we first need to apply some additional D365 setup and choose the right security pattern. So that’s where I’ll start this blog post.
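Whichever security pattern you land on, most Retail API calls boil down to two building blocks: acquiring an Azure AD token (here sketched with the client-credentials flow) and sending the request to a Retail Server endpoint with that token and the channel’s operating unit number. The sketch below only constructs the payloads; the tenant, client and endpoint names are hypothetical placeholders.

```python
# Sketch: the two building blocks of a Retail Server API call.
# All identifiers (tenant, client id, endpoint name) are hypothetical.

def build_token_request(client_id: str, client_secret: str,
                        resource: str) -> dict:
    """Form body for an Azure AD client-credentials token request,
    POSTed to https://login.microsoftonline.com/<tenant>/oauth2/token."""
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # the Retail Server app id URI / audience
    }

def build_retail_api_call(retail_server_url: str, api_name: str,
                          access_token: str, operating_unit: str) -> dict:
    """URL and headers for invoking a Retail Server OData action."""
    return {
        "url": f"{retail_server_url}/Commerce/{api_name}",
        "headers": {
            "Authorization": f"Bearer {access_token}",
            "OUN": operating_unit,  # operating unit number of the channel
            "Content-Type": "application/json",
        },
    }
```

The `OUN` header tells Retail Server which Retail channel the request is for; without it, most channel-specific APIs will refuse the request.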
Did you know that D365 for Retail contains more than 400 out-of-box Retail APIs (for the non-techies: interfaces), ranging from Loyalty management, Gift card management and Omni order management to Stock counting and Retail Logistics? These interfaces allow you to easily integrate with any 3rd party systems like e-commerce platforms, POS systems and Consumer Apps. They also allow you to quickly enrich your Retail Solution with handy Power Apps, for example for in-store Goods receipt, Stock counting, Labelling or Clienteling.
Unfortunately I still notice that many Customers and Partners do not consider the Retail APIs in their designs, which bypasses all the ‘free’ Retail Business Logic that comes with them. Instead, Customers and Partners start a custom coding exercise (re-inventing the wheel!) which often limits rather than empowers the Omni-channel capabilities of D365.
This is all mainly caused by a lack of documentation – It’s time to change that! In this series of blog posts we’ll explore the power of the Retail APIs so we can all benefit from them, using them to enhance the Retail experience for the Retail businesses we serve. As we need the non-techies on this journey, I’ll try to make it understandable for all of us. Making the right Architectural decisions starts with a good understanding of the solution. So let’s dive into that first!
1 – D365 Retail API Architecture explained
D365 Retail comes with 3 major components. The description below is a little technical, but it’s good to keep these components in mind for a better understanding of the main picture (picture 1) below:
Let’s go through each element in the picture, step-by-step:
- The Letter. An app (e.g. a POS system, e-commerce platform or Power App) can send a request to 1 of the 400+ D365 Retail endpoints. This request can be to request information, such as a customer’s loyalty balance, or to send information.
- The public Mailbox. Metaphorically, you can compare this request with a letter dropped into a mailbox. The mailbox in this case is the OData endpoint of Retail Server (1A). It’s a public mailbox, so any App can drop letters in there. But 2 prerequisites need to be met to allow this: the App needs to be a registered ‘member’ of the Receiver’s domain, and the App must have received a pre-authorisation code (token) which needs to be sent with the letter.
- The Internal Mailbox. Once successfully received in the public (ODATA) Retail Server mailbox, the letter (request) is forwarded to an internal mailbox: a specific Commerce Run Time (CRT) operation (1B). This CRT operation is responsible for processing the letter. In order to get this done, the CRT operation contacts 1 or multiple Retail Services (1C).
- Processing of the letter. The Retail Services work with subcontractors. There are 2 types of subcontractors: Data services, which can read/write information from/to the local Retail Database, and Remote Data services, which can receive/send information from/to the D365 backend. The Retail Services have been instructed (programmed) to contact either or both of the subcontractors to process a certain letter (request). For example, a letter to update the Loyalty Balance for a customer is forwarded to the D365 backend (since other Apps should work against the latest balance), whereas a new customer is created both in the local Retail Database and forwarded to the D365 Backend. In some cases the Retail Service has to read specific parameter settings first to determine which subcontractor to forward the work to. These parameters have been set in the D365 backend but are locally available to the Retail Services after sync to the local Retail Database. For example, a parameter controls whether e-com orders are created in the local Retail database (and from there synced to the D365 backend in bulk) or forwarded to the D365 backend in real time.
- Another Public mailbox. Remote Data Service ‘Subcontractors’ will drop the original letter in the Public mailbox of the D365 Backend: a SOAP mailbox, often referred to as Real Time Service (2A). This mailbox has a trust relationship with the Remote Data Service ‘subcontractor’ which is confirmed by a certificate.
- Another Internal mailbox. Much like the way Retail Server is organized, the agent looking after the Real Time Service mailbox forwards the letter to an internal mailbox for processing: the RetailTransactionService class in the D365 Backend (2B). From there, Retail Services in the D365 backend take care of the actual processing, which ties into many standard D365 (Retail) classes packed with Business Logic (2C and 2D). For example, in the case of customer creation, these services will include the customer in the address book of the Retail channel and assign Retail affiliations and Customer attributes. In other words: even though not all this information was part of the original letter, the Retail Services ensure the letter is processed in line with D365 (Retail) Parameter settings and Retail Business Logic.
- Response. As the sender of a letter, you’d love to get a response to your request, right? This is organized as follows: whenever an exception occurs somewhere across the chain of ‘middlemen’ subcontractors described above, the exception is written into a response letter which is returned to the sender (the App) following the chain in reverse order. If no exceptions occur, the original letter will reach its final destination. There, a response letter is written which is an in-depth response to the actual request, for example a response letter full of information on a newly created customer. Similar to what happens in case of an exception, the response letter is returned to the sender following the chain in reverse order.
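From the App’s point of view, the whole chain above collapses into interpreting the ‘response letter’ that comes back. The sketch below classifies responses along common OData/HTTP conventions; the `AccountNumber` field and exact error payload shape are illustrative assumptions, not an exact transcript of Retail Server behaviour.

```python
# Sketch: how an App might interpret the 'response letter' returned by
# the Retail Server chain. Status-code handling follows common OData/HTTP
# conventions; field names like 'AccountNumber' are illustrative only.

def interpret_response(status_code: int, body: dict) -> str:
    if 200 <= status_code < 300:
        # The letter reached its final destination; the body holds the
        # in-depth response, e.g. the newly created customer record.
        return f"success: {body.get('AccountNumber', '<no id>')}"
    # An exception somewhere in the chain of 'middlemen' was written into
    # the response letter and returned to the sender in reverse order.
    error = body.get("error", {}).get("message", "unknown error")
    return f"failed ({status_code}): {error}"
```

The key takeaway is that the App never talks to the CRT operations, Retail Services or Real Time Service directly; it only ever sees the single response letter that travels back up the chain.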
In the video below I illustrate how powerful the Retail APIs can be. In the video, I am using Microsoft Forms (an Office 365 App) to capture information as part of an imaginary Customer Sign-up process. Power Automate picks up the information and forwards it to D365 Retail Server to create the customer in the D365 backend in real time. As you will see in the video, the standard D365 Retail Business Logic is fully utilized here, as if the customer was captured on D365 MPOS: for example, the new customer is automatically included in the Retail channel’s address book.
3 – The potential
The left table below provides a breakdown of the Retail areas the 400+ Retail APIs apply to. A significant number of these APIs are built to provide real-time data exchange with the D365 backend. The right table below presents the Retail areas for which real-time connectivity to the D365 backend is supported.
Looking at the functional breadth of the Retail APIs and the video in this blog post, I hope you’re inspired to think about all the opportunities provided by the D365 Retail APIs. In upcoming blog posts on this topic, we’ll make deeper dives into the various aspects of these Retail APIs so we can all use this knowledge to further enrich the D365 Retail experience for our customers.
Now that we’ve constructed a ‘RSAT connector for Microsoft Flow’ in Part I of this Series, we have lots of new opportunities offered by the Microsoft Power Platform to push our Test Automation in D365 to a new level.
In the video below I’ll demonstrate how toggling the State of an Azure DevOps Test Suite to “Run RSAT” triggers MS Flow to iterate through the Test Cases in the Test Suite and let RSAT execute them (D365 Task Guides) in the background. MS Flow waits until RSAT execution is finished, then sends a notification e-mail with the logs from the RSAT execution as attachments.
See my OneDrive for the Flow which is used in the video.
Note: I didn’t manage to import this Flow package into another tenant than the one where the Flow was originally built – MS Flow complains about a problem with the authorisation against Azure DevOps – so you may have to amend the packaged (zipped) JSON definition of the Flow to allow import. Please share an updated .zip file if you manage to make amendments to the packaged .json definition that allow import.
We’re now in the new world of D365 “One Version” – a world in which we can benefit from continuous Microsoft investment without having to re-implement ERP ever again. But the new world also requires a solid Regression Testing Strategy to control the cost of consuming new updates. Here’s where Microsoft stepped in, giving us the Regression Suite Automation Tool (RSAT). Although this tool is a good starting point, anyone who has ever worked with RSAT will probably recognise one or more of the following limitations:
- I don’t want to go into a VM to run test cases!
- How can I test my D365 interfaces???
- How can I run multiple test cases/test suites in parallel (in other words: multi-threaded)?
- I don’t want my chained tests to be scripted in PowerShell, but in Azure DevOps (low code/no code solution!)
- How can I run mixed tests of process and interface? E.g. to support E2E scenarios like: my e-com platform creates a sales order through an interface into D365, which is then Released to warehouse through RSAT?
We can of course wait for Microsoft to come up with solutions to overcome these limitations. In my case, I decided to accept the challenge to leverage the Power Platform to fill the gaps. In this series of blog posts I’ll gradually tackle each of the above limitations one by one.
A key step in the overall solution is to find a way to move away from executing our D365 test cases directly from the RSAT tool, and to bring this into a higher control layer with broader capabilities than RSAT alone… Please welcome an RSAT connector for Microsoft Flow!
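The core idea can be sketched as a thin wrapper around the RSAT console app, so any higher control layer (a Flow, a queue worker, a web hook) can trigger a run instead of the RSAT UI. The executable path and the `playbacksuite` command below reflect the RSAT version I used; verify both against your own installation before relying on them.

```python
import subprocess
from pathlib import Path

# Sketch: wrapping the RSAT console app so a higher control layer (Flow,
# a web hook, a queue worker) can trigger test runs instead of the RSAT UI.
# The install path and the 'playbacksuite' command are assumptions based on
# the RSAT version I used; check yours.

RSAT_EXE = Path(r"C:\Program Files (x86)\Regression Suite Automation Tool"
                r"\Microsoft.Dynamics.RegressionSuite.ConsoleApp.exe")

def build_rsat_command(suite_name: str) -> list:
    """Command line that plays back every test case in an Azure DevOps
    test suite, using the locally configured RSAT settings."""
    return [str(RSAT_EXE), "playbacksuite", suite_name]

def run_suite(suite_name: str) -> int:
    """Execute the suite on the RSAT box and return the exit code."""
    result = subprocess.run(build_rsat_command(suite_name),
                            capture_output=True, text=True)
    return result.returncode
```

A control layer only needs to call `run_suite("MyRegressionSuite")` and inspect the exit code and log files, which is exactly the seam the Flow connector in this series hooks into.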
Get started with the QuickStart Guide I published on my OneDrive: