Power Automate Archives - Aric Levin's Digital Transformation Blog
http://aric.isite.dev/tag/power-automate/
Microsoft Dynamics 365, Power Platform and Azure

Power Platform 2021 Release Wave 2 Unified Interface Updates – In-App Notifications
http://aric.isite.dev/dynamics/post/2021-wave2-uci-in-app-notifications/
Sun, 17 Oct 2021 06:55:00 +0000

In preparation for our upcoming NYC BizApps event on October 27, 2021, and the Power Platform 2021 Wave 2 release (weekend of October 23/24), I am writing a series of blog posts related to some of the most sought-after updates. In this post we will review the new Notifications table and In-App Notifications.

Although In-App Notifications are documented as part of the Power Platform 2021 Release Wave 1 and were made available for public preview in July, they are not yet generally available, so I will be reviewing this new feature here.

So, what are in-app notifications? In-app notifications give us the ability to alert users about processes that have completed or require their attention, and they are displayed in the Model-driven app as part of the new notifications icon (or notification center).

This notifications feature is not available by default in every environment, and requires a System Administrator or App Maker to make it available within the app. In order to enable it, we need to call the following statement, which can be done from the console of our browser window while running our model-driven app:

fetch(window.origin + "/api/data/v9.1/SaveSettingValue()", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    AppUniqueName: "ENTER_APP_UNIQUE_NAME",
    SettingName: "AllowNotificationsEarlyAccess",
    Value: "true"
  })
});

An example of the Unique App Name would be: crde4_MyApp. Once this feature is enabled, we will be able to use the Notifications table to display notifications to end users.

Now that we have configured our environment, let's go ahead and create a notification. There are a few required (or recommended) parameters that we need to set in order to display the notification, such as the title, owner and body. The owner of the notification is the user that the notification will be displayed for. There are additional options such as Icon Type, Toast Type, Expiration and Data, which is a JSON string used for extensibility and for passing richer data into the notification. The screenshot below shows the properties.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Notification Table Properties
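As an illustration, a notification row like the one described above could also be created directly through the Dataverse Web API. This is a minimal sketch: the field names follow the appnotification table described in the Microsoft docs, while the user id and the option-set values used here are placeholders you would replace with your own.

```javascript
// Sketch: build the payload for creating an in-app notification row
// via the Dataverse Web API. The user id is a placeholder.
function buildNotificationPayload(title, body, userId) {
  return {
    title: title,
    body: body,
    // Bind the owner to the user the notification should appear for
    "ownerid@odata.bind": "/systemusers(" + userId + ")",
    icontype: 100000000, // assumed option value: Info
    toasttype: 200000000 // assumed option value: Timed
  };
}

const payload = buildNotificationPayload(
  "Account created",
  "A new account record was created.",
  "00000000-0000-0000-0000-000000000000"
);

// The actual call would then be (from a model-driven app context):
// fetch(window.origin + "/api/data/v9.1/appnotifications", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(payload)
// });
```

In the walkthrough below this is done with a Power Automate flow instead, but the column names are the same.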

You can read the Microsoft documentation on notifications at the link below:
https://docs.microsoft.com/en-us/powerapps/developer/model-driven-apps/clientapi/send-in-app-notifications#notification-table

Most of the samples in the above link use JavaScript. Let's demonstrate how this would look using a Power Automate flow.

In our example we will create a flow that displays a notification when a new account is created. This is a simple flow, without any additional parameters, or configuration of the JSON string.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Basic Flow

Let's go ahead and create the record. As this logic does not require a lot of data, I will just enter some minimal data. The only parameters that the flow actually uses are the name of the account and the creator. The image below shows the record that we created.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Create New Account to trigger flow

Once we create the new record, we will receive a notification in the Notification Center or as a toast that the account has been created as shown in the image below.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Basic Notification in Notification Center

Now, sometimes the toast notifications are not immediate, and we might have already saved and closed the record, so we would like a link back to it. Let's first modify the flow and see how this works in action. We will add the Data element to create our custom JSON string, so that the user can access the record that was created. The image below shows the changes to the Add a new row step of the flow to enable a click-through, so that you can open the record that was created.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Flow with JSON Data/Action
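For reference, here is a minimal sketch of what such a Data JSON string might look like, assuming the "actions" shape described in the Microsoft notification docs. The helper name and the URL format below are illustrative, and the exact shape may change while the feature is in preview.

```javascript
// Sketch: build the Data JSON string for a notification click-through
// action that opens the newly created record.
function buildNotificationData(entityName, recordId) {
  return JSON.stringify({
    actions: [
      {
        title: "Open record",
        data: {
          // Relative navigation URL to the record inside the app
          url: "?pagetype=entityrecord&etn=" + entityName + "&id=" + recordId
        }
      }
    ]
  });
}
```

In the flow, the equivalent string is simply typed into the Data column of the Add a new row action, with the record id coming from the trigger output.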

Once the flow executes, the notification will be displayed to the end user, with a link to navigate to the correct record. Note that notifications sometimes don't appear immediately and there is a slight delay but, as mentioned previously, this feature is still in preview.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Notification with Action in Notification Center

Additional posts related to Dynamics 365 and the Power Platform 2021 Release Wave 2 can be found by following the link below:

Power Platform 2021 Wave 2 Release Posts

The post Power Platform 2021 Release Wave 2 Unified Interface Updates – In-App Notifications appeared first on Aric Levin's Digital Transformation Blog.

Calling an Azure Pipeline using a Cloud Flow or Canvas App
http://aric.isite.dev/azure/post/calling-azure-devops-pipeline-flow-canvas/
Mon, 22 Feb 2021 04:35:00 +0000

With everything that is going on around ALM and CI/CD and how to implement an automated deployment process with Azure DevOps or GitHub Actions, there are still many organizations that have a lot of work to do before they can flip the switch.

In this post, I will show how we can use Power Automate Cloud flows to initiate an Azure DevOps Pipeline, and in turn use a Canvas App to call the flow that will call the Azure DevOps pipeline.

There are a few prerequisite steps that must be done, and you might already have done them, but I would like to review them again in case they have not been completed.

In order to call the Azure DevOps REST API, we need a Personal Access Token. This can be acquired by going to our Personal Account Settings and selecting Personal access tokens, as shown in the image below:

Azure DevOps Pipeline - Cloud Flow - Personal Access Token Menu

If you already have a Personal access token, clicking on the above link will show you the list of available tokens. If you don't, you will see a message saying that you don't have a personal access token yet, and you can click on one of the New Token links to create a new access token.

Azure DevOps Pipeline - Cloud Flow - Personal Access Token - New Token Button

You will need to give your Personal access token a name, specify the organization and expiration date, as well as the Scope of the token. For the purpose of this demonstration, I have given Full access, but you can also provide a custom defined scope, which lets you determine whether you want to provide Read, Write and Manage access to the different objects that make up the API.

Azure DevOps Pipeline - Cloud Flow - Personal Access Token - New Token Dialog

Once you click on the Create button, you will see the Success notification showing you the Personal access token, and an icon that allows you to copy the token to the clipboard. It is important to copy it and store it in a safe place, as once the window is closed, this token will no longer be accessible.

Azure DevOps Pipeline - Cloud Flow - Personal Access Token - New Token Confirmation

Now that we have the token, we will need to convert it to Base64, so that it can be used in our HTTP call in the flow, passing it as a Basic authentication token. This can be done by using Git Bash or another utility that converts to Base64. Open Git Bash by clicking on Start -> Git and then selecting Git Bash.

Azure DevOps Pipeline - Cloud Flow - Open Gitbash

Once Git Bash is opened, you will need to enter the following command in the Bash window:

$ echo -n :[PERSONAL_ACCESS_TOKEN] | base64

You will need to replace the [PERSONAL_ACCESS_TOKEN] with the real access token that you copied earlier. The following screenshot (with some blurring) shows the Git Bash command.

Azure DevOps Pipeline - Cloud Flow - Gitbash Command
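If you prefer not to use Git Bash, the same conversion can be sketched in Node.js; patToBasicToken is a hypothetical helper name, and the leading colon matches the empty-user-name convention Azure DevOps expects for Basic authentication.

```javascript
// Sketch: convert a Personal Access Token to the Base64 value used
// in a Basic Authorization header, mirroring `echo -n :[PAT] | base64`.
function patToBasicToken(pat) {
  // Azure DevOps Basic auth uses an empty user name, hence the leading colon.
  return Buffer.from(":" + pat).toString("base64");
}

// The header sent by the flow would then be: "Basic " + patToBasicToken(pat)
```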

Copy the result into Notepad++ or another text editor, as you will need it at a later time.

The next thing that we will need to get is the Pipeline Id that we are going to call. Navigate to Azure DevOps and click on Pipelines. This will give you the list of Pipelines as shown in the image below.

Azure DevOps Pipeline - Cloud Flow - Pipeline View

Click on the Pipeline that you want to execute from your Cloud flow or Canvas app, and let's examine the URL. You will notice that the end of the URL contains a definition Id. That is the Pipeline Id that we will need in order to execute the Pipeline.

Azure DevOps Pipeline - Cloud Flow - Pipeline Url (Definition Id)

Next, we will create a table in Dataverse, so that we can store the executions. Although not required, I like having a log of who executed these processes and when, mostly for historical purposes.

The columns that I added are the following, but additional columns can be added.

Azure DevOps Pipelines - Cloud Flow - Dataverse Execution Table

Let's go ahead and look at the flow that I added to the Solution. I start the flow using a Power Apps trigger, and initialize three variables containing the Pipeline Name, the Pipeline Id and the email of the user executing the flow. The image below shows these steps.

Azure DevOps Pipelines - Cloud Flow - PowerApps Trigger and Init Variables

You will notice that each of the variables being initialized uses the "Ask in PowerApps" option, so that the value is provided by my Canvas App. The next step is to call the REST API using an HTTP POST request. The link below is the Microsoft Docs page containing the information on the Pipelines REST API:

https://docs.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run%20pipeline?view=azure-devops-rest-6.1

The document details the actual URL to be used in the HTTP request, which is:

https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=6.1-preview.1

You will notice the HTTP POST request below. The blurred section contains the organization and project from DevOps within the URI parameter, and in the header we paste the result that we got from converting our Personal access token to Base64 in Git Bash.

Azure DevOps Pipelines - Cloud Flow - HTTP Action
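For clarity, the request the HTTP action sends can be sketched as follows. buildPipelineRunRequest is a hypothetical helper, and the organization, project and token values are placeholders for your own DevOps settings.

```javascript
// Sketch: assemble the Azure DevOps "run pipeline" request that the
// flow's HTTP action issues. Values passed in are placeholders.
function buildPipelineRunRequest(organization, project, pipelineId, basicToken) {
  return {
    url:
      "https://dev.azure.com/" + organization + "/" + project +
      "/_apis/pipelines/" + pipelineId + "/runs?api-version=6.1-preview.1",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // basicToken is the Base64 ":PAT" value produced earlier
      Authorization: "Basic " + basicToken
    },
    // An empty body starts the pipeline with its default parameters
    body: JSON.stringify({})
  };
}

const req = buildPipelineRunRequest("myorg", "myproj", 12, "TOKEN");
```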

At this point, the execution will commence. The final steps that I demonstrate below are really optional, but nice to have. If I would like to write the results back to my table in my Dataverse instance, I can retrieve the Build Id and Build Url from the result of the HTTP action. I can do this by using a Parse JSON action, which will give me the properties that I need.

The Parse JSON action takes the Body from the HTTP request and a copy of the schema. I can run the flow before adding the last two steps, copy the JSON result from the run results of the Cloud flow HTTP action step, and paste it into Generate from sample, which will generate the schema.

Azure DevOps Pipelines - Cloud Flow - Parse JSON Action

You can find the actual needed schema for this step pasted below.

[Parse JSON Schema]

Finally, the last step is to add the record to Dataverse. We have the Pipeline Id, Pipeline Name and User Email being passed from the Canvas App. The Build Id and Build Url are the results from the Parse JSON action, which are basically body('Parse_JSON')?['id'] and body('Parse_JSON')?['url'].

Azure DevOps Pipelines - Cloud Flow - Create Dataverse Row Action
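To illustrate, here is how the Build Id and Build Url would be pulled out of the run response. The sample response below is abbreviated and hypothetical, but the id and url properties mirror the flow expressions above.

```javascript
// Sketch: a trimmed-down, hypothetical pipeline run response, and the
// extraction of the two properties the flow stores in Dataverse.
const sampleResponse = JSON.stringify({
  id: 1234,
  url: "https://dev.azure.com/org/proj/_apis/pipelines/3/runs/1234",
  state: "inProgress"
});

const run = JSON.parse(sampleResponse);
// Equivalent to body('Parse_JSON')?['id'] in the flow
const buildId = run.id;
// Equivalent to body('Parse_JSON')?['url'] in the flow
const buildUrl = run.url;
```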

Once the flow is executed, we will see a new record in the Dataverse instance.

Azure DevOps Pipelines - Cloud Flow - Dataverse Execution Results

We will also see that the Azure DevOps pipeline is being initialized.

Azure DevOps Pipelines - Cloud Flow - Azure DevOps Execution Results

Now, in order to execute this, I created a Canvas App that contains all the Pipelines that are part of my process. This might be temporary, until the pipelines are automated and scheduled. The app is shown below.

Azure DevOps Pipelines - Cloud Flow - Canvas App Execution

When you click on the Run Pipeline button under each of the pipelines, it will call the Power Automate Cloud flow by using the following Canvas App Command.

InitializeDevOpsPipeline.Run("PublishExportUnpackGit", 3, User().Email)

This can of course be enhanced further, but for an initial execution of the pipelines this is a great help. It is the first step toward having an ALM process in place. As always, I hope this was helpful to some of our community members.

I also want to give special thanks to Paul Breuler from Microsoft for helping me out with some of these challenges.

The post Calling an Azure Pipeline using a Cloud Flow or Canvas App appeared first on Aric Levin's Digital Transformation Blog.

Adding Membership selections to Canvas Apps
http://aric.isite.dev/dynamics/post/add-membership-canvas-app/
Sun, 31 Jan 2021 23:17:00 +0000

As always, I try to bring real-world scenarios that I was required to implement, with the logic modified a little bit. In today's app we will be creating a project record and adding and removing members on the project, while adding custom logic to adhere to special circumstances. You can of course do this with a model-driven app, but the main requirement was to ensure that the percentages of ownership on the Project always total 100%.

The first part is to create a new project, which requires, as one of its fields, the name of the account on the project. When the project is saved, the primary contact on the account record is automatically added as a Project Member. The image below shows the creation of the new project.

Canvas App - Membership Management - New Project

When the record is saved, a Cloud Flow is called that retrieves the Primary Contact from the account record and creates the first Project Member on the Project. It also marks the member as Primary, so that it cannot be deleted from the app. The image below contains the steps in the Cloud Flow.

Canvas App - Membership Management - New Project - Cloud Flow

After the flow is executed, we will be able to see the project member that was added within the project record, as shown below.

Canvas App - Membership Management - New Project - After Cloud Flow

Next, let's go ahead and look at the Canvas App. When we load the Canvas App, we have the ability to select the project that we want. Once the project is selected, we will see the Project Members in the gallery (as shown below). You will notice that only a single member is visible, as this is a newly created project and only the default member has been added.

Canvas App - Membership Management - Canvas Initial View

You will notice the + button above the gallery. Clicking on it will allow us to add a new Project Member. This action shows a new section on the form which requests a Contact, Date Added and Percentage, as shown below. The Contact field is filtered to only show contacts that have not yet been added to the gallery.

Canvas App - Membership Management - Canvas Add New Member

Once the member is added, I will be able to see the two members in my gallery. I also see that the Save button is disabled, because the total membership percentage is not equal to 100%. I will add one more member for the purpose of the demo. I now see that I have three members with a total membership percentage of 160%.

Canvas App - Membership Management - Canvas Multiple Members - Unable to Save

In order for me to save this record, I will have to change the percentages so that they total 100%. Let's go ahead and do that, and then save the record to see the results.

Canvas App - Membership Management - Canvas Multiple Members - 100%

Canvas App - Membership Management - Canvas Multiple Members - After Save

Now, let's go back to the canvas app and remove one of these members. You will notice that Paul Cannon, who is the Primary Member, does not have a remove icon. All other members can be deleted. Let's go ahead and remove Nancy Anderson, who has a 30% membership. The member is highlighted as removed. In order to save the changes again, we have to update the percentages to total 100%, and then save the changes. The image below shows the record to be removed and the updated percentages.

Canvas App - Membership Management - Canvas Multiple Members - Remove

Now let's go ahead and save the record, and look at the results in the Model-Driven app. You will notice that I still see the three Project Members, but that is because I am showing both Active and Inactive users. The member that was removed from the group shows a date value in the Member Removed On field, which allows me to keep a history of the members added and removed.

Canvas App - Membership Management - Canvas Multiple Members - After Save (Member Removed)

Next, let's go ahead and look at the implementation. First, when the form loads, we select a project and click on the Go button. The OnSelect method of the Go button contains logic to load the project members into a collection, and to load contacts into a collection, removing the contacts that are already added to the project.

Set(SelectedProject, cmbProject.Selected.Projects);

// Add Project Members to the gallery
Clear(ProjectMembers);
ForAll(
    Filter('Project Members', Project.Projects = SelectedProject, Status = 'Status (Project Members)'.Active),
    Collect(ProjectMembers,
    {
        ProjectMemberId: 'Project Member',
        ProjectId: Project.Projects,
        ContactId: Contact.Contact,
        Name: Contact.'Full Name',
        DateAdded: 'Member Added On',
        Percentage: Percentage,
        IsPrimary: 'Is Primary',
        IsDeleted: false
    })
);

// Add all Contacts to the collection to allow adding of new contacts
Clear(AvailableContacts);
ForAll(
    Filter(Contacts, Role = 'Role (Contacts)'.'Decision Maker'),
    Collect(AvailableContacts,
    {
        ContactId: Contact,
        FullName: 'Full Name',
        IsAvailable: true
    })
);

// Mark the Contacts that are already in the Project Members gallery as not available
// so that they do not appear in the Contacts combo box
ForAll(ProjectMembers,
    Patch(AvailableContacts,
        First(Filter(AvailableContacts, FullName = Name)),
        { IsAvailable: false })
);

Set(hasError, false);

The plus button only sets an Action Type value to Add, which in turn displays the controls that allow adding a new record. Once clicked, I can see the Add Project Member, Contact, Date Added and Percentage fields, and the Accept and Cancel icons.

The Contact combo box has a filter to only show available contacts, using the following line:
Filter(AvailableContacts, IsAvailable = true)

Clicking on the check button (Accept) will add a new Project Team Member to the collection, and remove that member from the list of available Contacts.

If(IsBlank(cmbContact.Selected.FullName),
    Set(hasError, true),
    Collect(ProjectMembers,
    {
        ProjectId: GUID(SelectedProject),
        ContactId: cmbContact.Selected.ContactId,
        Name: cmbContact.Selected.FullName,
        Percentage: sliderPercentage.Value,
        DateAdded: dtpDateAdded.SelectedDate,
        IsPrimary: 'Is Primary (Project Members)'.No,
        IsDeleted: false
    });
    Patch(AvailableContacts,
        First(Filter(AvailableContacts, ContactId = cmbContact.Selected.ContactId)),
        { IsAvailable: false }
    );
    Set(hasError, false)
);

Reset(cmbContact);
Set(ActionType, "");

Finally, for the Save action, I loop through all the Project Team Members and either create a new member (based on the Project Member Id being blank) or update an existing record. The update patch can either update the membership percentage or deactivate the project member.

ForAll(ProjectMembers,
    If(IsBlank(ProjectMemberId),
        // Create new record
        Patch('Project Members', Defaults('Project Members'),
        {
            Project: LookUp(Projects, Projects = ProjectId),
            Contact: LookUp(Contacts, Contact = ContactId),
            'Is Primary': 'Is Primary (Project Members)'.No,
            'Member Added On': DateAdded,
            Percentage: Percentage
        }),
        // Update existing record
        Patch('Project Members', LookUp('Project Members', 'Project Member' = ProjectMemberId),
        {
            Percentage: Percentage,
            'Member Removed On': If(IsDeleted, Today(), Blank()),
            Status: If(IsDeleted, 'Status (Project Members)'.Inactive, 'Status (Project Members)'.Active),
            'Status Reason': If(IsDeleted, 'Status Reason (Project Members)'.Inactive, 'Status Reason (Project Members)'.Active)
        })
    )
);

Select(Button2);

Below you will also find a video of this blog post, and I will add the source code to my GitHub repository as soon as I can.

The post Adding Membership selections to Canvas Apps appeared first on Aric Levin's Digital Transformation Blog.

No-Code Solution for custom Change Log using Web Hooks and Cloud Flows
http://aric.isite.dev/development/post/no-code-custom-change-log-webhook-cloud-flow/
Mon, 25 Jan 2021 00:30:00 +0000

In one of our recent requirements, we had to log changes to certain fields in a few different entities. Since we needed the value of the field before and after the change, the logical option was to use plugins, adding a pre-image to the step in order to capture that data.

There are, however, other alternatives that might make more sense, especially if you want to minimize the amount of code and the number of plugins in the system.

The solution is simple. We create a Webhook, add to it the corresponding step and Pre-Image, and upon execution it triggers a Cloud flow where we can use the values from the Input Parameters and the Pre-Entity Image.

Let’s go ahead and see how we implement this. The first step is to create a Cloud flow, and set the trigger to When a HTTP request is received.

Dataverse Web Hook - Cloud Flows - HTTP Request is Received

The HTTP POST URL is not available until the flow is saved, so first enter the request Body JSON schema. You will need at least one action in order to save the flow, so you can add an Initialize variable action (which we will use later), set a name for the flow and save it. Once the flow is saved, the HTTP POST URL will be filled in and you will be able to configure the Webhook. Copy the HTTP POST URL and paste it into Notepad or another text editor. The result should be similar to what you see below:

https://prod-XXX.westus.logic.azure.com:443/workflows/41e03b6243cc47273a71f827fb3bd29b/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=e5138Klq7cOcbG1RJ2bXA42et4vFM0-kZ3k8jyK7Txs

Add new lines between the url and the query string parameters and remove the separators so that it looks like this:

https://prod-XXX.westus.logic.azure.com:443/workflows/41e03b6243cc47273a71f827fb3bd29b/triggers/manual/paths/invoke

api-version=2016-06-01

sp=%2Ftriggers%2Fmanual%2Frun

sv=1.0

sig=e5138Klq7cOcbG1RJ2bXA42et4vFM0-kZ3k8jyK7Txs

Replace all the %2F encodings with the corresponding character (/). This will change that line to look like this: sp=/triggers/manual/run
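The manual splitting and decoding described above can also be sketched programmatically. splitTriggerUrl is a hypothetical helper, and the URL below reuses the same placeholder values as the example.

```javascript
// Sketch: split a flow trigger URL into its path and decoded query
// parameters, mirroring the manual Notepad steps above.
function splitTriggerUrl(fullUrl) {
  const [path, query] = fullUrl.split("?");
  const params = {};
  for (const pair of query.split("&")) {
    const [key, value] = pair.split("=");
    // Decode %2F and other percent-encodings back to plain characters
    params[key] = decodeURIComponent(value);
  }
  return { path, params };
}

const parts = splitTriggerUrl(
  "https://prod-XXX.westus.logic.azure.com:443/workflows/abc/triggers/manual/paths/invoke" +
  "?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=XYZ"
);
```

The decoded `sp` value is exactly the `sp=/triggers/manual/run` line that goes into the Webhook registration.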

Now, let’s go and open the Plugin Registration Tool, and click on the Register drop down and select Register New Web Hook, or CTRL+W for short.

Dataverse Web Hook - Cloud Flows - Register Web Hook

This will open the Register New Web Hook dialog, where we will enter the HTTP URL and the different parameters that we specified above, as shown in the screenshot below.

Dataverse Web Hook - Cloud Flows - Web Hook Registration

Now that we have registered the Web Hook, you should be able to see the new Web Hook in the list of Registered Plugins. Select the Web Hook, and select Register New Step (either by Right Clicking on the Web Hook and selecting Register New Step or from the top tab menu).

Just as you would perform this step registration process for a plugin, do the same for the Web Hook. Select the Message, Primary Entity and Filtering Attributes as shown in the image below. Of course, you should customize this for your own message.

Dataverse Web Hook - Cloud Flows - Register New Step

Since we want to get the value before and after the change, we need to register a new image for the step of type Pre Image, as shown in the screenshot below. Add the name and entity alias, and specify the parameters that you want for your Pre Image.

Dataverse Web Hook - Cloud Flows - Register Pre Image

Now that we are done with the configuration within the Plugin Registration Tool, let’s take a look at where this data will be stored. We created a new entity called Change Log that contains information about the entity and the record that was modified, along with fields to store the values before and after the change.

Dataverse Web Hook - Cloud Flows - Change Log Table

Next, let's go back to the flow that we created. We will start by initializing a few variables that will be used to store some of the values, and then retrieve the values from the HTTP POST body.

It is a good idea to run the flow once so that you can review the Outputs of the When a HTTP request is received trigger, and that will help you with constructing the expressions needed for the different steps.

Dataverse Web Hook - Cloud Flows - HTTP Request received outputs

The variables are shown in the image below. The Pre and Post variables will be added later, while the Entity Id, Entity Name and User Id can be filled out during the initialization as follows:

  • triggerBody()?['PrimaryEntityId']
  • triggerBody()?['PrimaryEntityName']
  • triggerBody()?['UserId']

Dataverse Web Hook - Cloud Flows - Initialize Variables

Since getting the values from the Input Parameters and the Pre-Image is a little more complex, I used two Compose actions for these, and set the Inputs as follows:

  • triggerBody()?['InputParameters'][0]['value']['Attributes']
  • triggerBody()?['PreEntityImages'][0]['value']['Attributes']

Dataverse Web Hook - Cloud Flows - Compose Data Operations
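To make the two Compose expressions concrete, here is how they would resolve against a simplified, hypothetical webhook payload. Real Dataverse payloads contain many more properties; the attribute values below are invented for illustration only.

```javascript
// Sketch: a trimmed-down, hypothetical Dataverse webhook body.
const triggerBody = {
  PrimaryEntityId: "00000000-0000-0000-0000-000000000000",
  PrimaryEntityName: "account",
  UserId: "11111111-1111-1111-1111-111111111111",
  InputParameters: [
    { key: "Target", value: { Attributes: [{ key: "name", value: "Contoso (new)" }] } }
  ],
  PreEntityImages: [
    { key: "PreImage", value: { Attributes: [{ key: "name", value: "Contoso (old)" }] } }
  ]
};

// Equivalent to triggerBody()?['InputParameters'][0]['value']['Attributes']
const changedAttributes = triggerBody.InputParameters[0].value.Attributes;
// Equivalent to triggerBody()?['PreEntityImages'][0]['value']['Attributes']
const preImageAttributes = triggerBody.PreEntityImages[0].value.Attributes;
```

Each element in these arrays is a key/value pair, which is why the later Apply to each steps match on the attribute key.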

The final step before storing the record is an Apply to each over the input parameters, checking the value corresponding to the attribute that was changed. We run this for both the Pre-Entity image results and the Input Parameter results, with a condition to check for the matching attribute.

The Condition logic expression is items('Apply_to_each_PreEntity_Image')?['key'], and the Set variable Pre action contains the following expression: items('Apply_to_each_PreEntity_Image')?['value']

Dataverse Web Hook - Cloud Flows - Apply to Each, Conditions and Set Variables

The same applies to the other apply to each action. Once we have retrieved all the information that we need, the change log record will be created. 

Dataverse Web Hook - Cloud Flows - Create Change Log Record

The results will look like this:

Dataverse Web Hook - Cloud Flows - Change Log Record Results

The post No-Code Solution for custom Change Log using Web Hooks and Cloud Flows appeared first on Aric Levin's Digital Transformation Blog.

Approval Process with Outlook Adaptive Cards
http://aric.isite.dev/dynamics/post/adaptive-cards-outlook-approvals/
Wed, 06 Jan 2021 07:50:00 +0000

To start the year, I would like to review Adaptive Cards. Last summer I had a Power Storm session with a couple of my fellow MVPs, Alex Shlega and Linn Zaw Win, to get more familiar with using Adaptive Cards, adding them to Cloud flows and Microsoft Teams, and sending adaptive cards via Outlook. About 5 months have passed since then, and now I needed to build something for a project that I am working on.

You can read about our findings in Alex’s post below.

https://www.itaintboring.com/powerstorm/adaptive-cards-findings-powerstorm-session-summary/

Let's start with some background. The tables and data in this post have been built only for the purpose of this post. In our case, we have a transaction record that holds information about the details of a particular purchase or acquisition. After the record is created, it is submitted for multiple levels of approval by an analyst, associate, director and legal, and finally, after all the approvals have been completed, it can be completed or published. The screenshot below shows a transaction record with some of the data populated.

Adaptive Cards - Transaction Form (UCI)

As the user requests approvals, a flow is executed, and based on the Action Type (Submission, Approval or Rejection) and the team the request is sent to (Analyst Team, Associate Team, Director Team or Legal Team), the flow processes the request. The screen below shows the different stages that this approval goes through.

Adaptive Cards - Transaction Stages (UCI)

Now let’s briefly look at the flow that we created. For testing purposes, I created a flow with a manual trigger that accepts three input parameters: the Action Type, the Next Stage (or Status) and the Transaction Unique Identifier. For the actual implementation I have a transition entity that specifies what action can be performed in a particular stage, and what the subsequent stages can be. For example, if I am in Associate Review, the two available types of Actions are Approve and Reject. If I approve, the subsequent stage will be Director Review, and if I reject, the subsequent stage will move back to Analyst Review.
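As a rough sketch, the transition entity behaves like a lookup table keyed on the current stage and the action taken. The table contents and the helper name below are hypothetical, modeled on the stages described above:

```python
# Hypothetical sketch of the stage-transition table described above.
# (current stage, action) -> next stage; names follow the post, structure is illustrative.
TRANSITIONS = {
    ("Analyst Review", "Approve"): "Associate Review",
    ("Associate Review", "Approve"): "Director Review",
    ("Associate Review", "Reject"): "Analyst Review",
    ("Director Review", "Approve"): "Legal Review",
    ("Director Review", "Reject"): "Associate Review",
}

def next_stage(current: str, action: str) -> str:
    """Return the subsequent stage for an action, as the transition entity would."""
    try:
        return TRANSITIONS[(current, action)]
    except KeyError:
        raise ValueError(f"Action '{action}' is not allowed in stage '{current}'")
```

Keeping the transitions in data rather than in branching logic is what lets the same flow serve every stage.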

The flow starts by querying a few different entities and getting some parameters from our Dataverse instance. We start by querying the System User entity and getting the System User Id, and then querying the String Maps and getting the Action Type Code (as shown in the screenshot below)

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Get User and Action Type

The expression for the System User Id is:
first(outputs('List_Systemuser_records')?['body/value'])?['systemuserid']

The expression for the Action Type Code is:
first(body('List_action_type_records')?['value'])?['attributevalue']
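In other words, these expressions perform a null-safe "first row, then column" lookup over the rows a List records action returns. A minimal Python sketch of the same idea (the helper name is mine, not part of the flow):

```python
def first_value(rows, column):
    """Mimic first(outputs(...))?['column']: the column of the first row, or None."""
    if not rows:
        return None  # first() of an empty list yields null in the flow as well
    return rows[0].get(column)
```

This is why the flow does not fail outright when the query returns no rows; the expression simply yields null.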

Next, we need to get the stage id of the next stage, and the unique identifier of the last transaction stage record which needs to be updated with the action and the stage.

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Get Stages and Transaction Stages

The expression for the Stage Id is:
first(outputs('List_stage_records')?['body/value'])?['bas_stageid']

The expression for the Transaction Stage Id is:
first(outputs('List_unfinished_stage_records')?['body/value'])?['bas_transactionstageid']

Now that we have all the information, we can start updating the data in our existing transaction stages and transaction tables.

The first thing that will be updated is the Transaction stage record that is In Progress. This will be updated with the Action Type, the ending stage, the User that completed the action, and the completion date. The screenshot below shows that update.

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Update Transaction Stage

Next, we need to create a new transaction stage record. This will contain the beginning stage of the record and its owner. The owner is available in the Stage Transitions table, which contains the name of the owning team.

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Create Transaction Stage

Finally, we will update the transaction entity with the current stage so that users navigating to the record can see the current status. This also allows us to manage security on the record by reassigning it to a different team that will have edit access.

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Update Transaction Record

Now that all of the updates have been completed, I need to get some data from the Transaction record and its related records to populate the email. We will get all the data from the Transaction table, and then only the Account Name from Accounts, the Full Name from Contacts, and the Full Name from User for the name of the approver.

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Get Data for Email

Before we look at the email, let’s take a look at a few links that help us design the adaptive card.

The Adaptive Card Designer allows you to design cards, not only for Outlook but also for other hosting apps, such as Bot Framework, Microsoft Teams, Cortana and other Windows applications. You can copy the payload of your card into your flow from here as well.

https://adaptivecards.io/designer/

The Actionable Message Designer is a tool that allows you to customize the Adaptive Card that will be used for Outlook. You can design it using a visual designer, and then use the JSON Payload to copy into your flow.

https://amdesigner.azurewebsites.net/

Both designers provide a list of predefined samples which you can use when designing your Adaptive Card. In our particular case, we used the Expense Approval sample card from the Actionable Message Designer.

Adaptive Cards - Actionable Message Designer

Now let’s get back to our flow, and see the implementation of the adaptive card.

We enter the To and Subject in the Email Message, and before we paste the body of the email, we have to embed it into an HTML page. The following is the required content:

<html>
<head>
<script type="application/adaptivecard+json">
<!-- The Adaptive Card payload will go here -->
</script>
</head>
</html>
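If you generate the email body programmatically, the embedding amounts to dropping the JSON payload into that script tag. A minimal Python sketch, with a hypothetical helper name:

```python
import json

def build_email_body(card: dict) -> str:
    """Wrap an adaptive card payload in the HTML scaffold Outlook expects."""
    card_json = json.dumps(card)
    return (
        "<html>\n<head>\n"
        '<script type="application/adaptivecard+json">\n'
        f"{card_json}\n"
        "</script>\n</head>\n</html>"
    )
```

In the flow itself, the payload is simply pasted between the script tags of the HTML shown above.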

The content of the adaptive card payload is long, so I will show this in an animated gif below. It will contain information about the transaction, the clients and the status history of the transaction.

Adaptive Cards - Microsoft Power Automate - Cloud Flow - Send Email Action

After the flow has been completed, the below email containing the adaptive card will be received.

Adaptive Cards - Result in Email

Now that we have created the adaptive card, there are a few things to keep in mind. There are certain security requirements that apply before you can send an adaptive card to any user. If you are only doing this within your organization, this might be a little easier, but if it’s global, there are additional requirements.

https://docs.microsoft.com/en-us/outlook/actionable-messages/security-requirements

You will need to register an originator for your actionable messages and add it to the payload of your actionable card. Publishing is done by registering the account that will have access to send actionable cards:

https://outlook.office.com/connectors/oam/publish
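Once you have an originator ID, it goes into the card payload itself as an originator property. In this sketch the GUID is a placeholder and the helper name is mine:

```python
# Placeholder originator GUID: use the value issued when you register
# your provider through the actionable email publishing page above.
ORIGINATOR_ID = "00000000-0000-0000-0000-000000000000"

def with_originator(card: dict, originator: str = ORIGINATOR_ID) -> dict:
    """Return a copy of the card payload with the originator field stamped in."""
    stamped = dict(card)
    stamped["originator"] = originator
    return stamped
```

Without a registered originator that matches the sending account, Outlook will not render the card for recipients outside your test setup.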

The full solution can be found in my GitHub repository, and hopefully I can create a video of this sometime soon.

https://github.com/ariclevin/AdaptiveCards

The post Approval Process with Outlook Adaptive Cards appeared first on Aric Levin's Digital Transformation Blog.

]]>
Calling MS Graph API from Canvas Apps by using Power Automate and HTTP Request http://aric.isite.dev/azure/post/http-request-msgraph-canvas-app-flow/ Mon, 14 Dec 2020 06:28:00 +0000 https://aric.isite.dev/index.php/2020/12/14/calling-ms-graph-api-from-canvas-apps-by-using-power-automate-and-http-request/ Recently, while working on some requirements, I noticed that one of the solutions that the company implemented was to replicate the Azure Groups and Members from AD into their Dataverse environment. This seemed to me to be unnecessary, but sometimes due to security restrictions, this might be the only way.

The post Calling MS Graph API from Canvas Apps by using Power Automate and HTTP Request appeared first on Aric Levin's Digital Transformation Blog.

]]>
Recently, while working on some requirements, I noticed that one of the solutions that the company implemented was to replicate the Azure Groups and Members from AD into their Dataverse environment. This seemed to me to be unnecessary, but sometimes due to security restrictions, this might be the only way.

After further investigation, the only requirement from the business was to check whether particular users belonged to groups, and there was no immediate requirement to store the AD members in our Dataverse environment, especially since we would have to continuously sync between AD and Dataverse.

I offered the business an alternative. What if you had an application where you could specify the name of the group and it would show you all of the users that belong to it, or, even better, specify the user and see all the groups the user belongs to? This seemed like a no-brainer, and from our point of view an easy solution, especially since we finally got access to use the Graph API (for Users and Groups only).

There are of course other alternatives, but this was going to work for us, especially since individual users did not have access to the Graph API, while we had an App Registration with a Client Id and Secret.

The following section briefly explains how to set up permissions for the Microsoft Graph API. Log in to Azure and click on App Registrations. You will need to set up the API permissions and the client secret, and finally copy the information so that you can use it within your flow.

Once you have created the new App Registration and given it a name, click on API Permissions, select Microsoft Graph, and choose the Application type (not Delegated). You will need to add two sets of permissions: Group.Read.All and User.Read.All, and then make sure that you grant consent, as these permissions require admin consent.

Microsoft Graph - App Registration - Api Permissions

Next, set up the client secret. Click on Certificates & secrets and select the option to add a new client secret. You can set the client secret to expire in one year, two years, or never. After you have created the client secret, copy it into Notepad or another program; you will need it later. Once you leave the App Registration, you will not be able to retrieve the client secret again, so make sure that you store it for later use.

Microsoft Graph - App Registration - Client Secret

Now that you are done, go back to the Overview page of your app registration. You will need to copy the Application (client) ID and the Directory (tenant) ID, the same way you copied the Client Secret before. The following image shows the information on the Overview page.

Microsoft Graph - App Registration - Overview

Since I don’t really like designing stuff, and prefer to take predesigned templates, I took the Org Browser Canvas App template that is available from the Create App page.

The app contains more features than I was looking for, so I trimmed it down to a minimum, so that it just contains the home screen and the search screen.

At the end, I had two screens. Let’s quickly go over these. I named the app AD Search. My home screen contains the title and logo, and two buttons: User Search and Group Search, which both redirect to the Search Screen after setting the actionType parameter to either Users or Groups.

The View my profile option at the bottom is still in progress. I have not yet decided what to include there.

Microsoft Graph - Canvas App - Home Screen

When the Search Screen loads, it clears any previous search results by calling Clear on the ADSearchResults collection, so every search starts fresh.

The form displays a search control; when the search text is entered and the search icon is clicked, it calls a Power Automate flow to retrieve the users matching the email address or the groups matching the display name of the group.

The following screenshots show both scenarios.

Microsoft Graph - Canvas App - Search Screen

If we look at the source for the search icon OnSelect function, it will show us that we are adding the results from the GraphUserSearch flow or GraphGroupSearch flow into a new collection called ADUserResults.

If(actionType = "Users",
    ClearCollect(ADUserResults, GraphUserSearch.Run(txtInputSearchUser.Text, actionType)),
    ClearCollect(ADUserResults, GraphGroupSearch.Run(txtInputSearchUser.Text, actionType))
)

The Gallery Items points to ADUserResults, and then we show the Initial, DisplayName and Title of each person in the results of each gallery item.

Now, let’s look at the logic in Power Automate. But first, in case anyone is not aware, I would like to introduce Graph Explorer, which can help with configuring Graph API requests. Graph Explorer can be accessed via: https://developer.microsoft.com/en-us/graph/graph-explorer.

Both flows start the same way, and we could combine them into a single flow, but I split them to simplify this article. Our trigger for this flow is Power Apps, and then we initialize four variables of type string. These variables are the Search String (containing the value of the search string from the Canvas App), the Action Type (containing the action from the Canvas App, i.e. the type of search we will be performing: Users or Groups), the Query String and the Search Results (a placeholder for the results). The image below illustrates this.

Microsoft Graph - Power Automate Flow - Trigger and Init Actions

Next, we set the Query String variable. This will contain the Graph API query string that will be called, as shown in the image below.

Microsoft Graph - Power Automate Flow - Query String Variable

We can take that same query string and test it out in Graph Explorer to make sure that it works before adding it to the flow. Next, we need to call the API using a GET request, passing the query string that we specified in the URI parameter. We add a Content-Type header with a value of application/json, as our results will be in JSON format.

We need to provide the authentication method to retrieve the results. As we created an App Registration with a client secret, we will use Active Directory OAuth. This is where we will need the information that I previously mentioned you should write down.

We will provide the Directory (Tenant) Id, the Audience, the Application (Client) Id and the Client Secret. The image below illustrates the HTTP request.

Microsoft Graph - Power Automate - HTTP Request
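For comparison, outside of Power Automate the same Active Directory OAuth handshake is a client-credentials token request against the tenant's token endpoint, followed by the Graph call. The sketch below only builds the requests (no network call is made), and every ID is a placeholder:

```python
from urllib.parse import urlencode

GRAPH = "https://graph.microsoft.com"

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the client-credentials token request for the Graph audience."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    form = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{GRAPH}/.default",  # the audience is the Graph API
    })
    return url, form

def user_query(email: str) -> str:
    """Build a Graph query like the one used by the user-search flow."""
    return f"{GRAPH}/v1.0/users?" + urlencode({"$filter": f"mail eq '{email}'"})
```

The HTTP action in the flow performs exactly this exchange under the hood when you fill in the Tenant, Audience, Client Id and Secret fields.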

Finally, we need to store the results in the variable we instantiated earlier (called Search Results), and then pass the Response back to the Canvas App using the Response action (of the Request Connector).

Microsoft Graph - Power Automate - Search Results and Response

The value that is entered in the SearchResults variable is the value of the body of the previous step, or:
body('HTTP_Graph_Users')?['value']

We enter that value in the Body of the response. We also need to specify the Response Body JSON Schema, which will contain the elements that will be returned to the Canvas App. The sample below shows this text.

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "id": {
                "type": "string"
            },
            "displayName": {
                "type": "string"
            },
            "givenName": {
                "type": "string"
            },
            "surname": {
                "type": "string"
            },
            "mail": {
                "type": "string"
            },
            "jobTitle": {
                "type": "string"
            },
            "userPrincipalName": {
                "type": "string"
            }
        },
        "required": [
            "id",
            "displayName",
            "givenName",
            "surname",
            "mail",
            "jobTitle",
            "userPrincipalName"
        ]
    }
}
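The schema simply declares which properties each returned item carries, so the Canvas App knows the shape of the records. A quick, hypothetical helper for sanity-checking a sample Graph user item against the required list:

```python
# Required fields from the Response Body JSON Schema above.
REQUIRED = ["id", "displayName", "givenName", "surname",
            "mail", "jobTitle", "userPrincipalName"]

def missing_fields(item: dict) -> list:
    """Return the required schema fields absent from a Graph user item."""
    return [field for field in REQUIRED if field not in item]
```

If a Graph item omits one of the required properties (some user records have no mail or jobTitle), the Response action can fail schema validation, which is worth checking before wiring up the app.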

When we want to use the same logic for querying the groups, the flow is similar, but a few options change. After initializing the variables, we first need to query the Graph API to get the Id of the group that we are querying, and only then can we get the members of the group. This flow therefore contains two calls to the API, as illustrated in the image below.

Microsoft Graph - Power Automate - Group Flow

The solution files have been posted to my GitHub repository:
ariclevin/MSGraph (github.com)

A YouTube video is available as well:
https://youtu.be/DqqpDmdaVxc

Special shoutout to Elaiza Benitez for her episode on Microsoft Graph API authentication on What the Flow:
How to authenticate as an application with Microsoft Graph API (youtu.be)

The post Calling MS Graph API from Canvas Apps by using Power Automate and HTTP Request appeared first on Aric Levin's Digital Transformation Blog.

]]>
Service Principal Connection References and using Invoker’s Connection http://aric.isite.dev/powerapps/post/connection-reference-service-principal-invoker/ Mon, 30 Nov 2020 19:00:00 +0000 https://aric.isite.dev/index.php/2020/11/30/service-principal-connection-references-and-using-invokers-connection/ As Microsoft is still making changes to connection references, I have been trying to get them to work with a service principal account, and came across a few interesting points that are important for anyone who will be developing and deploying Power Automate flows using a Service Principal connection reference.

The post Service Principal Connection References and using Invoker’s Connection appeared first on Aric Levin's Digital Transformation Blog.

]]>
As Microsoft is still making changes to connection references, I have been trying to get them to work with a service principal account, and came across a few interesting points that are important for anyone who will be developing and deploying Power Automate flows using a Service Principal connection reference.

Let’s start with the basics. I created a flow that will trigger when the Payment Status column of a transaction record is updated to the value of Sent. Once that is done, I retrieve the existing value from the associated Contact record, increment the total transaction amount and then update the total transaction amount on the Contact record.

Let’s start with the trigger. I created an update trigger on the Transactions table (custom), set the filtering attributes to the column that I want it to trigger on, and finally set Run as to Triggering user.

Connection Reference Service Principal - Run As Triggering User

I have also set the Connection Reference to use the Service Principal account that I created. So far, there are no issues.

Next, I add logic to retrieve the associated contact record (I only need the transaction amount column), check if the value is null, and set the Total Transaction Amount in a variable. If the contact record has an existing value, I will also increment the variable.

Connection Reference - Service Principal - Flow Conditions

Finally, I will call an Update action on the contact record, and set the Total Transaction Amount column to the value of the variable that I set in the previous step. The final step looks like the image below.

Connection Reference - Service Principal - Update Action

Now for the test. I ran the test twice, with two different results based on changes to the action that updates the contact record. The list below shows the two records that were updated in the transaction table:

Connection Reference - Service Principal - Active Transactions View

The first time that I ran the process, I used the Service Principal account that I created.

Connection Reference - Service Principal - Invoker's Connection not selected

The results in this situation were that the updated record showed it was modified by the Service Principal account, and not by the user that was set as the triggering user of the flow. I guess this makes sense, as I should be able to tell the system in each action whether I want it to be executed by the triggering user or by the service account itself.

Connection Reference - Service Principal - Updated Contacts View showing Modified By Service Principal

Since I don’t really want to show that this was executed and updated by the Service Principal account, I needed to see what had to be modified to get this working. If we take a look at the more options of the Update record action, we see that there is an option to use the invoker’s connection. This will execute the action as the same user account that was set on the trigger of the flow. The image below shows the invoker’s connection set.

Connection Reference - Service Principal - Invoker's Connection is selected

Once I changed the second transaction’s Payment Status to Sent, I could see that the second updated record contains the correct Modified By value, which is the triggering user.

Connection Reference - Service Principal - Updated Contacts View showing Modified By Triggering User

The next step is to deploy this to a higher environment as part of the solution. As we already know, when we deploy across environments and have flows that use connection references, we have to provide the connection reference that the flow will use in the higher environment. In that case, we need to create a new connection reference in each environment and, when importing the solution, set it accordingly.

The one thing to notice is that, as connection references are still in preview, there is currently a limit on how many flows can use a single connection reference. At the time of writing, that limit is 16. I am pretty sure that by the time connection references are out of preview, that limit will change.

I will update this post as we find out about new changes to connection references.

The post Service Principal Connection References and using Invoker’s Connection appeared first on Aric Levin's Digital Transformation Blog.

]]>
Considerations of Embedding Canvas Apps in Model Driven forms – Revisited http://aric.isite.dev/dynamics/post/embedded-canvas-app-in-uci-revisited/ Sun, 15 Nov 2020 10:19:00 +0000 https://aric.isite.dev/index.php/2020/11/15/considerations-of-embedding-canvas-apps-in-model-driven-forms-revisited/ A few weeks ago, I wrote a blog post about embedding a Canvas App in a Model-driven form, comparing whether, in my case, I should have embedded the app using the Canvas App control or, as we implemented it, using an iframe on the form.

The post Considerations of Embedding Canvas Apps in Model Driven forms – Revisited appeared first on Aric Levin's Digital Transformation Blog.

]]>
A few weeks ago, I wrote a blog post about embedding a Canvas App in a Model-driven form, comparing whether, in my case, I should have embedded the app using the Canvas App control or, as we implemented it, using an iframe on the form.

Although the decision to use the iframe seemed like the correct one at the time, since we could control the security of the Canvas App by passing a parameter from the Model-driven form into the app, it ended up being a crash-and-burn situation because we needed the app to be available in both the desktop and mobile applications. At the time of writing this article, Canvas Apps embedded in iframes are not accessible on mobile devices.

Even Scott Durow’s Ribbon Workbench, using smart buttons, has the ability to open a Canvas App in a new modal window, but that too has limitations, as it will not work on mobile devices due to the cross-domain authentication restrictions imposed by the Power Apps mobile application. You can vote for Scott’s support request by clicking the link below:

https://powerusers.microsoft.com/t5/Power-Apps-Ideas/Support-Canvas-App-modal-popup-inside-Model-Driven-Mobile-App/idi-p/704962

So, now that we had this issue, we had to find a way to resolve it. As I mentioned in my previous post, we had a complex security model, which included the sharing of records, but also a custom-implemented tab-level security that determined whether a read-only or read-write mode of the app would be accessible to the end user. All of this logic was executed via JavaScript, so it was very easy to pass a parameter, but that wasn’t accessible via a mobile device.

We consulted with Microsoft, and one approach was to get all the entities that make up that security model and write logic within our Canvas App to deal with it. That would be overkill.

The other approach we thought of was, of course, that Power Apps can call Power Automate flows. We wanted to see if this could be done, and after thorough testing, we determined this would be a good choice. We needed to port the JavaScript logic that tested tab-level security into our flow; we needed additional logic to check whether the logged-in user is the owner of the record or a member of any of the owning teams; and we needed to check whether the user had the correct sharing privileges via Principal Object Access, which would also tell us whether the record was read-only.

Most of the flow handled the tab-level security, but the problem was that we could not query the Principal Object Access entity, as it is not available in the list of entities within CDS.

Although I was eager to start “playing” with Custom APIs, I reverted to using custom actions and plugin code to handle the Principal Object Access security. In some organizations that you work with, preview and new features require approvals.

Embedded Canvas App - Power Automate Flow calling Custom Action

So, I created a custom action to check whether the user had the correct sharing (Write Access) and whether the user was an owner (or member of an owning team) of the record. Since the accessrightsmask return value is the sum of all access rights that are shared with a user, I used a BitArray to test for the Write Access value (2). The link below shows the access right types on the Microsoft Docs site.

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/web-api/accessrights?view=dynamics-ce-odata-9

Once I created the functions and tested the logic, I passed two parameters back from the custom action: whether the user is the owner and whether the user has access.

The code below shows the code that would check if the principal (user or team) has the correct access in the principalobjectaccess entity.

private bool principalHasAccess(Guid principalId, Guid objectId, int position)
{
    bool hasAccess = false;

    // Retrieve the access rights mask shared with this principal (user or team)
    // for the given record from the principalobjectaccess entity.
    QueryExpression query = new QueryExpression("principalobjectaccess");
    query.ColumnSet.AddColumns("accessrightsmask");
    query.Criteria.AddCondition("principalid", ConditionOperator.Equal, principalId);
    query.Criteria.AddCondition("objectid", ConditionOperator.Equal, objectId);

    EntityCollection results = Context.SystemOrgService.RetrieveMultiple(query);
    foreach (Entity poa in results.Entities)
    {
        // The mask is the sum of all shared access rights; test the requested
        // bit (position 1 corresponds to Write Access = 2).
        int accessRightsMask = poa.GetAttributeValue<int>("accessrightsmask");
        BitArray accessRights = new BitArray(new int[] { accessRightsMask });
        hasAccess = accessRights.Get(position);
    }
    return hasAccess;
}
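The same bitmask test is easy to express outside the plugin. In the sketch below, the flag values follow the standard Dataverse AccessRights enumeration (Write Access = 2, as in the post), but treat the constants as illustrative:

```python
# Common Dataverse AccessRights flag values; Write = 2 as referenced in the post.
READ, WRITE, APPEND, APPEND_TO = 1, 2, 4, 16

def has_right(access_rights_mask: int, right: int) -> bool:
    """Equivalent of the BitArray lookup: test one flag in the summed mask."""
    return bool(access_rights_mask & right)
```

For example, a mask of 3 (Read + Write) passes the Write check, while a mask of 1 (Read only) does not, which is exactly what the BitArray position test in the plugin determines.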

Once the flow was completed, the only thing left to do was call it from the Canvas App and set the return variable determining whether the app should run as read-only or read-write.

The image below shows the temporary screen that was displayed to the user while the security flow was being executed.

Embedded Canvas App - App calling flow

Now that we had gone through all this work to implement this, my thought was that there has to be a way to pass parameters between a Model-driven app and a Canvas App in a supported way. Unfortunately, there is none that I am aware of (or that my Microsoft contacts are aware of).

I created a new idea in the Power Apps forum, so if you think this is something that would be helpful to you in future implementations, please vote it up.

https://powerusers.microsoft.com/t5/Power-Apps-Ideas/Pass-parameters-from-Model-driven-for-to-embedded-Canvas-app/idi-p/746134

UPDATE: My friend and fellow MVP, Alex Shlega, just posted a link on querying Principal Object Access from within Power Automate flows. Check it out if you are looking for that part of the solution. Thank you, Alex:
https://www.itaintboring.com/dynamics-crm/how-to-verify-principle-object-access-directly-from-the-flow/

The post Considerations of Embedding Canvas Apps in Model Driven forms – Revisited appeared first on Aric Levin's Digital Transformation Blog.

]]>
Update Data in your CDS using Azure SQL Server, Azure Functions or Connectors from SQL Data http://aric.isite.dev/dynamics/post/cds-azuresql-azurefunc-connector/ Mon, 14 Sep 2020 00:30:00 +0000 https://aric.isite.dev/index.php/2020/09/14/update-data-in-your-cds-using-azure-sql-server-azure-functions-or-connectors-from-sql-data/ Recently, while working on a project that needed to update the exchange rates in CDS, I was tasked with finding a solution that could retrieve data from a SQL Server hosted on an Azure Virtual Machine. There were many different approaches, with security being the main concern, but I decided to do a deep dive and test how Power Automate could accommodate these requests.

The post Update Data in your CDS using Azure SQL Server, Azure Functions or Connectors from SQL Data appeared first on Aric Levin's Digital Transformation Blog.

]]>
Recently, while working on a project that needed to update the exchange rates in CDS, I was tasked with finding a solution that could retrieve data from a SQL Server hosted on an Azure Virtual Machine. There were many different approaches, with security being the main concern, but I decided to do a deep dive and test how Power Automate could accommodate these requests.

The first thing, of course, was creating the SQL Server database and adding the data to it. I created a table in SQL Server, to be updated on a regular basis, containing the exchange rates that I needed. For the purpose of this post, I used Azure SQL Server, and not a managed instance or a SQL Server hosted on Azure. The image below shows the table and values that were added to the database.

Exchange Rates Solution - SQL DB Design

You can see in the image that we store the Exchange Rate Code, the Exchange Rate Date (in both date and string format) and the Exchange Rate value on that particular date.

Next, I created an entity in CDS to store that information. I called the new entity Transaction and added fields for the Transaction Amount, Exchange Rate and the Transaction Amount in US Dollars. When adding a new record to the Transaction table, we will only enter the Currency and the Transaction Amount.

The first test to be performed was creating a Power Automate flow that would be triggered on the creation of a new Transaction record, retrieve the code name of the currency, and then call Azure SQL Server (using a username and password connection) to get the rows matching the query we provided.

I then initialized a CurrentDate variable that would contain today’s date in the format of the date string that I created in the database. The formula for that date was:
formatDateTime(utcNow(), 'yyyyMMdd')
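For comparison, the Python equivalent of that expression (a sketch, not part of the flow) is:

```python
from datetime import datetime, timezone

def current_date_key() -> str:
    """Today's UTC date in the yyyyMMdd string format stored in the SQL table."""
    return datetime.now(timezone.utc).strftime("%Y%m%d")
```

Keeping the key as a fixed-width digit string makes the equality filter against the SQL date-string column straightforward.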

I used the Get rows (V2) action, adding a Filter Query with a condition where the Exchange Rate Code is the code I previously retrieved and the date is the current date set in the previous step. The Select Query would return the Exchange Rate value that I needed.

The image below shows the steps to get here.

Exchange Rate Solution - Power Automate Flow #1

Next, although I knew I would have a single result, I still used an Apply to each, since I used Get rows for my SQL Server connection. I add the value property to the Apply to each action, and then add a Compose step to calculate the amount in US dollars. The calculation here is slightly more complex, but still fairly easy if you know the names of your actions and items. It uses mul to multiply two values, converting each of them to float to allow the multiplication to happen.

mul(float(items('Apply_to_each_3')?['ExRateValue']), float(triggerOutputs()?['body/nyc_transactionamount']))
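Since both the SQL rate and the trigger amount arrive as strings, the expression converts each with float() before multiplying. In plain Python terms (the helper name is hypothetical):

```python
def usd_amount(exchange_rate: str, transaction_amount: str) -> float:
    """Mirror mul(float(rate), float(amount)) from the flow expression."""
    return float(exchange_rate) * float(transaction_amount)
```

Skipping the float() conversion in the flow produces a type error, because mul cannot multiply the string values coming out of the SQL row and the trigger body.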

The final step is to update the Transaction record, passing the Exchange Rate value from the SQL query results and the Transaction Amount in US dollars. The image below shows the logic built in the Apply to each section.

Exchange Rate Solution - Power Automate Flow #2

Next, we save the flow and test the logic. To do this, we navigate to our Model-driven app and provide the Currency and the Transaction Amount. As the flow was created to trigger on the creation of a new transaction record or an update of the total amount, it will execute. The image below shows the information entered in the record before the flow executed. You will notice that the transaction number, transaction exchange rate and transaction amount (USD) are left empty.

Exchange Rates Solution - New Transaction record (CDS)

You can wait approximately 5-10 seconds and then see the results in the flow, as shown below:

Exchange Rates Solution - Completed Transaction record (CDS)

Now that we have seen that we can do this by connecting to SQL Server directly, I wanted to test a few other alternatives. I created an Azure function that would connect to the SQL Server and retrieve the same value for me. The main benefit is that this Azure function would be available across the board, even if you did not have access to SQL Server. You will need to add the System.Data.SqlClient NuGet package to your Azure function project and reference it in the header of your file. Your function code will expect two values as part of a query string: the currency code and the formatted date. The code is not fully optimized, but you can see it below:

Exchange Rate Solution - Azure Api function (Code)
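The original function is written in C# using System.Data.SqlClient and is shown above only as an image. As a rough illustration of the same idea — query string in, single rate value out — here is a Python sketch with the database call abstracted away; the table and column names are assumptions:

```python
def get_exchange_rate(code, date_key, run_query):
    """Look up a single exchange-rate value for a currency code and a
    yyyyMMdd date key. 'run_query' abstracts the database call; in the
    original C# function this is a parameterized SqlCommand against
    Azure SQL. Table and column names are illustrative."""
    sql = ("SELECT ExRateValue FROM ExchangeRates "
           "WHERE ExRateCode = ? AND ExRateDate = ?")
    rows = run_query(sql, (code, date_key))
    if not rows:
        raise LookupError(f"No rate for {code} on {date_key}")
    return float(rows[0][0])

# A fake query runner standing in for the SQL connection:
fake_db = lambda sql, params: [("1.1872",)] if params == ("EUR", "20200901") else []
print(get_exchange_rate("EUR", "20200901", fake_db))  # 1.1872
```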

Now that we have created the Azure function, let's take a look at what the modified Power Automate flow will look like. The trigger and the first three actions will be the same. Instead of the Get rows action, we have an HTTP request, and we pass the formatted URL containing the code and date parameters that the Azure function expects. We use a Compose action, similarly to what we did before, to multiply the result from the Azure function (the exchange rate) and the total amount of the base currency: mul(float(body('HTTP')), float(triggerOutputs()?['body/nyc_transactionamount']))
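Constructing that formatted URL for the HTTP action can be sketched as follows; the function-app host and the parameter names (currency, date) are illustrative, not the actual endpoint:

```python
from urllib.parse import urlencode

def build_function_url(base_url: str, code: str, date_key: str) -> str:
    """Build the GET URL for the HTTP action calling the Azure function.
    The host and parameter names here are assumptions for illustration."""
    return f"{base_url}?{urlencode({'currency': code, 'date': date_key})}"

print(build_function_url(
    "https://myfuncapp.azurewebsites.net/api/GetExchangeRate",
    "EUR", "20200901"))
# https://myfuncapp.azurewebsites.net/api/GetExchangeRate?currency=EUR&date=20200901
```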

Finally, we update the record in CDS with the values of the Exchange Rate and the result of the multiplication. The image below shows the final flow:

Exchange Rate Solution - Power Automate flow calling Azure Api function

I ran the same tests in CDS and the results were the same, only now using an API, so that it could be called from anywhere.

Now that I saw that this works, I wanted to add another option. I built a custom connector that would call the Azure API, so that it could be accessed directly from the flow. This worked nicely as well. The image below shows the test that I ran within the connector builder; the connector can easily be added to a Power Automate flow.

Exchange Rate Solution - Custom Connector

Although this entire process is doable, it is somewhat cumbersome, as you have to update your exchange rates on a daily basis and make sure that the data in your database is synced. There are also APIs available that will allow you to simply call a custom API and retrieve the data that you are looking for.

One of these solutions is called XE Currency Data; for $800/year you get 10,000 API requests per month, with additional requests available at a higher price. I have not reviewed other vendors of a similar API, but this seems to be something that should be readily available, maybe even for free at smaller request volumes.

The post Update Data in your CDS using Azure SQL Server, Azure Functions or Connectors from SQL Data appeared first on Aric Levin's Digital Transformation Blog.

]]>
Microsoft Forms Pro is becoming Dynamics 365 Customer Voice http://aric.isite.dev/dynamics/post/forms-pro-becoming-customer-voice/ Wed, 02 Sep 2020 05:49:00 +0000 https://aric.isite.dev/index.php/2020/09/02/microsoft-forms-pro-is-becoming-dynamics-365-customer-voice/ In late July, during Inspire 2020 event, Microsoft made an announcement about the upcoming changes to their forms and survey application (Forms Pro), and rebranding this as a completely new product Dynamics 365 Customer Voice.

The post Microsoft Forms Pro is becoming Dynamics 365 Customer Voice appeared first on Aric Levin's Digital Transformation Blog.

]]>
In late July, during the Inspire 2020 event, Microsoft made an announcement about the upcoming changes to their forms and survey application (Forms Pro), rebranding it as a completely new product: Dynamics 365 Customer Voice.

The announcement after Inspire was published in the Dynamics 365 blogs:

https://cloudblogs.microsoft.com/dynamics365/bdm/2020/07/22/democratizing-customer-feedback-with-the-new-dynamics-365-customer-voice/

Depending on where in the world your tenant is, this push would have happened sometime during August. If you try to access Forms Pro using the forms.office.com site, you will get a notification that Forms Pro is now Dynamics 365 Customer Voice, and that you should be accessing your surveys there. It also mentions that surveys will no longer be available on the Forms web site after 9/30/2020.

Forms Pro to Dynamics 365 Customer Voice - Transition Message

When you click on Open Customer Voice, you will see a completely new interface (and URL). The home page provides a New Project or Get Started button from which you can create a new survey, either from a template or from blank. Out of the box, at this time, there are 4 available templates, covering a customer service survey, a delivery survey, a service visit survey and a support survey.

Forms Pro to Dynamics 365 Customer Voice - New Project

After you select your survey, you will be asked which environment you want to connect the survey to, and the survey will be automatically created with a predefined list of questions. You can remove any of the predefined questions or add new ones, similar to the way this was done in Forms Pro. The navigation and the customization of the survey have been redesigned and revamped, though.

The left pane, or main navigation pane, contains the list of surveys that are associated with your current project. This means that you can have a single project with multiple surveys within it, as well as the satisfaction metrics and the outcome of your survey, which will show you the responses from the survey.

The right pane is your customization pane. This allows you to customize parts of your survey with branching, variables, languages, branding, formatting and satisfaction metrics. Clicking on any one of these links will show the settings for that particular area of customization.

Forms Pro to Dynamics 365 Customer Voice - Customization Pane

Once you are done working on your survey, you can easily send it out from within Customer Voice. The Send tab provides various options for sending out the survey: via Automation, Email, Embedding, Link or QR Code. The screen below shows the Send tab of the survey. You can personalize your email, use templates and even embed a survey question in the email, add the survey and unsubscribe links, or personalize the survey with additional fields based on the uploaded data. You can also add automation for events, such as sending surveys when leads are qualified or cases are resolved.

Forms Pro to Dynamics 365 Customer Voice - Send Survey Options

As of the time of writing, there have not been any changes to the Power Automate Forms connectors. I am not exactly sure whether there is a plan to change them or bring additional features, but they still provide the same functionality for Customer Voice that they did for Forms Pro.

Forms Pro to Dynamics 365 Customer Voice - Power Automate

Hope this gives you some insight into the changes that have been or will be deployed to your tenants, and that the new changes will really make surveys a lot easier to use.

The post Microsoft Forms Pro is becoming Dynamics 365 Customer Voice appeared first on Aric Levin's Digital Transformation Blog.

]]>