Use Integration Account to Store Schemas and Maps
Azure Logic Apps provides an efficient way to map the outputs from triggers and actions into the inputs of subsequent actions. This capability is enabled, in part, by OpenAPI/Swagger. But what happens if you aren’t trying to push data to a connector that uses OpenAPI? The answer lies in the Integration account.
The Integration account acts as a container for storing Schemas, Maps and Assemblies. Using these artifacts allows for richer transformations, such as XML to CSV, B2B Electronic Data Interchange (EDI) or complex data mappings that involve large enterprise systems like SAP.
In this post, we will walk through the process of using an Integration account from end to end.
The tooling for creating Schemas, Maps and Assemblies is available in Visual Studio 2015 with the Azure Logic Apps Enterprise Integration Tools for Visual Studio 2.0. Once these tools are installed, we can focus on creating an Integration account in the Azure Portal.
There are three different tiers available for Integration accounts, and each has its own entitlement quotas. Please review the following link to determine the correct tier for you. For this post, we will provision the free tier.
With our Integration account created we can view the components that exist within it. Naturally, since we just created it, it is empty.
Let’s now head over to Visual Studio 2015 where we can author our schemas and maps. Since we installed the Enterprise Integration Pack, we will have a new Visual Studio project type called Integration Account. We will create a new instance of this project type.
With our project created, we can add new items much like in any other Visual Studio-based project. In our scenario, we will add two XML schemas and one Map.
Our (simple) solution is now complete. We have a source schema, a destination schema and a map that will transform the source input into the destination output. We also have access to functoids, so I added a String Concatenate functoid that will merge two source data elements into one, as the destination schema requires.
Before we upload our artifacts to Azure, we should build our project. The reason is that the build generates an XSLT file, which is what we will use as our map inside of Logic Apps. If we skip this step and try to upload the .btm file instead, we will receive a “The provided map definition is not valid.” error message.
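To illustrate, the XSLT emitted for a map like ours might look roughly like the following sketch. The element names (Source, Destination, FirstName, LastName, FullName) are invented for illustration, not taken from an actual generated map:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical sketch of the XSLT a map build might generate;
     element names are invented for illustration. -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/Source">
    <Destination>
      <!-- The String Concatenate functoid becomes a concat() call -->
      <FullName>
        <xsl:value-of select="concat(FirstName, ' ', LastName)" />
      </FullName>
    </Destination>
  </xsl:template>
</xsl:stylesheet>
```

The real generated file will be longer, but the core of the transformation is a template per destination node with XPath expressions pulling from the source.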
We have a few options for uploading these artifacts into our Integration account in Azure, including PowerShell, the REST API, ARM templates and the Azure Portal. For this post we are going to focus on the Azure Portal. To do this, we need to access our Integration account and then click on the Schemas tile. We can now add our schemas by clicking +Add.
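As a sketch of the PowerShell option, the AzureRM Logic Apps cmdlets of this era looked roughly like the following. The resource group, account and artifact names are placeholders, and the exact parameter set may differ by module version:

```powershell
# Hypothetical resource names; adjust to your environment
New-AzureRmIntegrationAccountSchema -ResourceGroupName "rg-integration" `
    -Name "my-integration-account" -SchemaName "SourceSchema" `
    -SchemaType "Xml" -SchemaFilePath ".\SourceSchema.xsd"

New-AzureRmIntegrationAccountMap -ResourceGroupName "rg-integration" `
    -Name "my-integration-account" -MapName "SourceToDestination" `
    -MapType "Xslt" -MapFilePath ".\bin\Debug\SourceToDestination.xslt"
```

This is handy for scripting deployments across environments rather than clicking through the Portal.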
We also need to upload our Map using a similar process. This time, however, we will upload the XSLT file generated by the build, which can be found in our bin/Debug folder.
With our artifacts uploaded, we can use them within a logic app. Our logic app will be quite simple and can be created as we normally would. But we do need to associate our Integration account with our logic app before we can use it in any actions. To do this, we access the Workflow settings for our logic app, select our Integration account and then press Save.
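If you deploy via ARM templates instead, this association appears as an integrationAccount reference in the workflow resource's properties. A rough, abbreviated sketch (the account name is a placeholder):

```json
{
  "type": "Microsoft.Logic/workflows",
  "apiVersion": "2016-06-01",
  "properties": {
    "integrationAccount": {
      "id": "[resourceId('Microsoft.Logic/integrationAccounts', 'my-integration-account')]"
    },
    "definition": { }
  }
}
```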
Next, we will update our logic app and expose an HTTP Request trigger and then we will add a Transform XML action.
Azure Logic Apps is able to perform dynamic message type look-ups. As a result, we only need to provide our Map name and the Content, which will come from our HTTP request. This differs from BizTalk Server, where we also had to identify our input and output message types.
We will use an HTTP Response action to return our transformed message back to our caller.
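Under the covers, the resulting workflow definition contains roughly these two actions. This is an abbreviated sketch, and the map name SourceToDestination is a placeholder:

```json
"actions": {
  "Transform_XML": {
    "type": "Xslt",
    "inputs": {
      "content": "@triggerBody()",
      "integrationAccount": {
        "map": { "name": "SourceToDestination" }
      }
    },
    "runAfter": {}
  },
  "Response": {
    "type": "Response",
    "inputs": {
      "statusCode": 200,
      "body": "@body('Transform_XML')"
    },
    "runAfter": {
      "Transform_XML": [ "Succeeded" ]
    }
  }
}
```

Note that the Transform XML action only references the map by name; the Integration account association on the logic app is what resolves it.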
We can use Postman to send in an XML message that matches our source schema. Logic Apps will in turn transform this message and return an XML message that matches our destination schema.
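As an illustration with invented element names, a request and its transformed response might look like:

```xml
<!-- Request body (matches a hypothetical source schema) -->
<Source>
  <FirstName>Ada</FirstName>
  <LastName>Lovelace</LastName>
</Source>

<!-- Response body (matches a hypothetical destination schema,
     with the two source elements concatenated into one) -->
<Destination>
  <FullName>Ada Lovelace</FullName>
</Destination>
```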
In this post we discussed Integration accounts and how to use them to address a common scenario found in enterprise integration: data transformation. This approach will aid in delivering solutions for B2B EDI, trading partner integration and line-of-business systems integration, including SAP.