Flow Archives - Aric Levin's Digital Transformation Blog
https://aric.isite.dev/tag/flow/
Microsoft Dynamics 365, Power Platform and Azure

Power Platform 2021 Release Wave 2 Unified Interface Updates – In-App Notifications
https://aric.isite.dev/dynamics/post/2021-wave2-uci-in-app-notifications/
Sun, 17 Oct 2021

The post Power Platform 2021 Release Wave 2 Unified Interface Updates – In-App Notifications appeared first on Aric Levin's Digital Transformation Blog.

In preparation for our upcoming NYC BizApps event on October 27, 2021, and the Power Platform 2021 Wave 2 release (weekend of October 23/24), I am writing a series of blog posts related to some of the most sought-after updates. In this post we will review the new Notifications table and In-App Notifications.

Although In-App Notifications are documented as part of the Power Platform 2021 Release Wave 1, and were made available for public preview in July, they are not yet generally available, so I will be reviewing this new feature here.

So, what are in-app notifications? In-app notifications give us the ability to alert users about processes that have completed or require their attention, and they are displayed in the Model-driven app as part of the new notifications icon (or notification center).

This notifications feature is not available by default in every environment, and requires a System Administrator or App Maker to make it available within the app. To enable it, we need to call the following statement (which can be done from the console of our browser window while running our model-driven app):

fetch(window.origin + "/api/data/v9.1/SaveSettingValue()", {
    method: "POST",
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        AppUniqueName: "ENTER_APP_UNIQUE_NAME",
        SettingName: "AllowNotificationsEarlyAccess",
        Value: "true"
    })
});

An example of the unique app name would be: crde4_MyApp. Once this feature is enabled, we will be able to use the Notifications table to display notifications to end users.

Now that we have configured our environment, let's go ahead and create a notification. There are a few required (or recommended) parameters that we need to set in order to display the notification, such as the title, owner and body. The owner of the notification is the user that the notification will be displayed for. There are additional options such as Icon Type, Toast Type, Expiration and Data, a JSON string that is used for extensibility and for passing richer data into the notification. The screenshot below shows the properties.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Notification Table Properties

You can read the Microsoft documentation on the Notification table at the link below:
https://docs.microsoft.com/en-us/powerapps/developer/model-driven-apps/clientapi/send-in-app-notifications#notification-table
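The documented samples create notifications from client script with `Xrm.WebApi`; as a rough illustration (not taken from this post), the same row can also be created directly against the Dataverse Web API. The `appnotifications` entity set, the option-set values and the owner GUID below are assumptions based on the public documentation:

```javascript
// Build the body of an in-app notification row. Field names follow the
// appnotification table documentation; the option-set values and the owner
// GUID are placeholders/assumptions.
function buildNotification(title, body, ownerId) {
  return {
    title: title,
    body: body,
    // The owner is the user the notification will be displayed for.
    "ownerid@odata.bind": "/systemusers(" + ownerId + ")",
    icontype: 100000000, // assumed: Info
    toasttype: 200000000 // assumed: Timed
  };
}

const payload = buildNotification(
  "Account created",
  "A new account was just created.",
  "00000000-0000-0000-0000-000000000000"
);

// From the console of a model-driven app, the request would look roughly like:
// fetch(window.origin + "/api/data/v9.1/appnotifications", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(payload)
// });
console.log(JSON.stringify(payload));
```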

Most of the samples in the above link are written in JavaScript. Let's demonstrate how this would look using a Power Automate flow.

In our example we will create a flow that displays a notification when a new account is created. This is a simple flow, without any additional parameters, or configuration of the JSON string.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Basic Flow

Let's go ahead and create the record. As this logic does not require a lot of data, I will just enter some minimal data. The only parameters that the flow actually uses are the name of the account and the creator. The image below shows the record that we created.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Create New Account to trigger flow

Once we create the new record, we will receive a notification in the Notification Center, or as a toast, that the account has been created, as shown in the image below.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Basic Notification in Notification Center

Sometimes the toast notifications are not immediate, and we might have already saved and closed the record, so we would like the notification to link back to the record. Let's first modify the flow and see how this works in action. We will add the Data element to create our custom JSON string, so that the user can access the record that was created. The image below shows the changes to the Add a new row action of the flow that enable a click-through, so that you can open the record that was created.
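As an illustration only, based on the in-app notification documentation rather than on the screenshot, a Data payload for a click-through might look roughly like this (entity name and GUID are placeholders):

```json
{
  "actions": [
    {
      "title": "View account",
      "data": {
        "url": "?pagetype=entityrecord&etn=account&id=00000000-0000-0000-0000-000000000000"
      }
    }
  ]
}
```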

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Flow with JSON Data/Action

Once the flow executes, the notification is displayed to the end user, with a link to navigate to the correct record. Note that notifications sometimes don't appear immediately and there is a slight delay, but as mentioned previously, this feature is still in preview.

Power Platform 2021 Release Wave 2 Unified Interface - Notifications - Notification with Action in Notification Center

Additional posts related to the Dynamics 365 and Power Platform 2021 Release Wave 2 can be found by following the link below:

Power Platform 2021 Wave 2 Release Posts

Calling MS Graph API from Canvas Apps by using Power Automate and HTTP Request
https://aric.isite.dev/azure/post/http-request-msgraph-canvas-app-flow/
Mon, 14 Dec 2020

The post Calling MS Graph API from Canvas Apps by using Power Automate and HTTP Request appeared first on Aric Levin's Digital Transformation Blog.

Recently, while working on some requirements, I noticed that one of the solutions that the company implemented was to replicate the Azure Groups and Members from AD into their Dataverse environment. This seemed to me to be unnecessary, but sometimes due to security restrictions, this might be the only way.

After further investigation, the only requirement from the business was to check whether particular users belonged to groups, and there was no immediate requirement to have the AD members stored in our Dataverse environment, especially since we would have to continuously sync between AD and Dataverse.

I offered the business an alternative: what if you had an application where you could specify the name of a group and it would show you all of the users that belong to it, or, even better, specify a user and it would tell you all the groups the user belongs to? This seemed like a no-brainer, and from our point of view an easy solution, especially since we finally got access to use the Graph API (for Users and Groups only).

There are other alternatives to this of course, but this was going to work for us, especially since individual users did not have access to the Graph API, but we had an App Registration with a Client Id and Secret.

The following section briefly explains how to set up permissions to the Microsoft Graph API. Log in to Azure and click on App Registrations. You will need to set up the API permissions and the Client Secret, and finally copy the information so that you can use it within your flow.

Once you have created the new App Registration and given it a name, click on API Permissions, select Microsoft Graph, and choose the Application type (not Delegated). You will need to add two sets of permissions, Group.Read.All and User.Read.All, and then make sure that you grant consent, as these permissions require admin consent.

Microsoft Graph - App Registration - Api Permissions

Next, set up the Client Secret. Click on Certificates & secrets and select the option to add a new Client Secret. You can set the Client Secret to expire in 1 year, 2 years or never. After you have created the Client Secret, copy it into Notepad or another program; you will need it later. Once you leave the App Registration, you will not be able to retrieve the Client Secret again, so make sure that you store it for later use.

Microsoft Graph - App Registration - Client Secret

Now that you are done, go back to the Overview page of your app registration. You will need to copy the Application (client) ID and the Directory (tenant) ID, the same way you copied the Client Secret before. The following image shows the information on the Overview page.

Microsoft Graph - App Registration - Overview
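With those three values copied, the app can acquire an app-only token using the client-credentials grant. This is a generic sketch of that token request (not the flow's HTTP action, which handles this internally); the tenant id, client id and secret are placeholders:

```javascript
// Build a client-credentials token request for Microsoft Graph.
// Tenant id, client id and client secret are placeholders.
function buildTokenRequest(tenantId, clientId, clientSecret) {
  const params = new URLSearchParams({
    client_id: clientId,
    client_secret: clientSecret,
    scope: "https://graph.microsoft.com/.default",
    grant_type: "client_credentials"
  });
  return {
    url: "https://login.microsoftonline.com/" + tenantId + "/oauth2/v2.0/token",
    body: params.toString()
  };
}

const req = buildTokenRequest("TENANT_ID", "CLIENT_ID", "CLIENT_SECRET");
// fetch(req.url, {
//   method: "POST",
//   headers: { "Content-Type": "application/x-www-form-urlencoded" },
//   body: req.body
// }).then(r => r.json()); // response carries the access_token
console.log(req.url);
```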

Since I don't really like designing stuff, and prefer to start from predesigned templates, I took the Org Browser canvas app template that is available from the Create App page.

The app contains more features than what I was looking for, so I trimmed it down to a minimum, so that it just contains the home screen and the search screen.

In the end, I had two screens. Let's quickly go over them. I named the app AD Search. My home screen contains the title and logo, and two buttons, User Search and Group Search, which both redirect to the Search Screen after setting the actionType parameter to either Users or Groups.

The View my profile link at the bottom is still in progress; I have not yet decided what to include there.

Microsoft Graph - Canvas App - Home Screen

When the Search Screen loads, it clears any previous search results from the results collection by calling Clear, so it is always a new search.

The form displays a search control; when search text is entered and the search icon is clicked, it calls a Power Automate flow to retrieve the users matching the email address or the groups matching the display name of the group.

The following screenshots show both scenarios.

Microsoft Graph - Canvas App - Search Screen

If we look at the source of the search icon's OnSelect function, we can see that we add the results from the GraphUserSearch flow or the GraphGroupSearch flow into a collection called ADUserResults.

If(actionType = "Users",
    ClearCollect(ADUserResults, GraphUserSearch.Run(txtInputSearchUser.Text, actionType)),
    ClearCollect(ADUserResults, GraphGroupSearch.Run(txtInputSearchUser.Text, actionType))
)

The Gallery's Items property points to ADUserResults, and we show the Initial, DisplayName and Title of each person in each gallery item.

Now, let's look at the Power Automate logic. But first, in case anyone is not aware, I would like to introduce Graph Explorer, which can help with configuring Graph API requests. Graph Explorer can be accessed at: https://developer.microsoft.com/en-us/graph/graph-explorer.

Both flows start the same way, and we could combine them into a single flow, but I split them to simplify this article. The trigger for the flows is Power Apps, and then we initialize four variables of type string: the Search String (containing the value of the search string from the Canvas App), the Action Type (containing the action from the Canvas App, which can be Users, Employees or Groups, i.e. the type of search that we will be performing), the Query String and the Search Results (a placeholder for the results). The image below illustrates this.

Microsoft Graph - Power Automate Flow - Trigger and Init Actions

Next, we set the Query String variable. This contains the Graph API query string that will be called, as shown in the image below.

Microsoft Graph - Power Automate Flow - Query String Variable
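The actual query string is only shown in the screenshot; as an assumed illustration, a user search by email address might be built like this (the `mail` property is standard Microsoft Graph, the filter value is a placeholder):

```javascript
// Illustration: a Graph query string for looking a user up by email.
function buildUserQuery(searchString) {
  return "https://graph.microsoft.com/v1.0/users?$filter=mail eq '" + searchString + "'";
}

console.log(buildUserQuery("jane.doe@contoso.com"));
```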

We can take that same query string and test it in Graph Explorer to make sure that it works before adding it to the flow. Next, we call the API using a GET request, passing the query string that we specified in the URI parameter. We add a contentType header with a value of application/json, as our results will be in JSON format.

We need to provide the authentication method to retrieve the results. As we created an App Registration with a Client Secret, we will use Active Directory OAuth. This is where we will need the information that I previously mentioned you should write down.

We will provide the Directory (Tenant) Id, the Audience, the Application (Client) Id and the Client Secret. The image below illustrates the HTTP request.

Microsoft Graph - Power Automate - HTTP Request

Finally, we store the results in the variable we initialized earlier (called Search Results), and then pass the response back to the Canvas App using the Response action (of the Request connector).

Microsoft Graph - Power Automate - Search Results and Response

The value that is entered in the SearchResults variable is the value of the body of the previous step, or:
body('HTTP_Graph_Users')?['value']

We enter that value in the Body of the response. We also need to specify the Response Body JSON Schema, which will contain the elements that will be returned to the Canvas App. The sample below shows this text.

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "id": {
                "type": "string"
            },
            "displayName": {
                "type": "string"
            },
            "givenName": {
                "type": "string"
            },
            "surname": {
                "type": "string"
            },
            "mail": {
                "type": "string"
            },
            "jobTitle": {
                "type": "string"
            },
            "userPrincipalName": {
                "type": "string"
            }
        },
        "required": [
            "id",
            "displayName",
            "givenName",
            "surname",
            "mail",
            "jobTitle",
            "userPrincipalName"
        ]
    }
}

When we want to use the same logic for querying the Groups, the flow is similar, but a few things change. After initializing the variables, we first need to query the Graph API to get the Id of the group that we are querying, and only then can we get the members of the group. This flow contains two calls to the API. The image below illustrates the calls to the API.

Microsoft Graph - Power Automate - Group Flow
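The two calls above can be sketched as URL builders, resolving the group id from its display name and then listing that group's members. The group name and GUID are placeholders:

```javascript
// Sketch of the two Graph calls the group flow makes.
const base = "https://graph.microsoft.com/v1.0";

// First call: resolve the group by display name to obtain its id.
function groupLookupUrl(displayName) {
  return base + "/groups?$filter=displayName eq '" + displayName + "'";
}

// Second call: list the members of the resolved group.
function groupMembersUrl(groupId) {
  return base + "/groups/" + groupId + "/members";
}

console.log(groupLookupUrl("Finance Team"));
console.log(groupMembersUrl("00000000-0000-0000-0000-000000000000"));
```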

The solution files have been posted to my GitHub repository:
ariclevin/MSGraph (github.com)

A YouTube video is available as well:
https://youtu.be/DqqpDmdaVxc

Special shoutout to Elaiza Benitez for her episode on Microsoft Graph API authentication on What the Flow:
How to authenticate as an application with Microsoft Graph API (youtu.be)

Service Principal Connection References and using Invoker's Connection
https://aric.isite.dev/powerapps/post/connection-reference-service-principal-invoker/
Mon, 30 Nov 2020

The post Service Principal Connection References and using Invoker’s Connection appeared first on Aric Levin's Digital Transformation Blog.

As Microsoft is still making changes to connection references, I have been trying to get them to work with a service principal account, and encountered a few interesting points that are important for anyone who will be developing and deploying Power Automate flows using a Service Principal connection reference.

Let's start with the basics. I created a flow that triggers when the Payment Status column of a transaction record is updated to the value Sent. Once that happens, I retrieve the existing value from the associated Contact record, increment the total transaction amount, and then update the total transaction amount on the Contact record.

Let's start with the trigger. I create an update trigger on the (custom) Transactions table, set the filtering attributes to the column that I want it to trigger on, and finally set Run as to Triggering user.

Connection Reference Service Principal - Run As Triggering User

I have also set the Connection Reference to use the Service Principal account that I created. So far, there are no issues.

Next, I add logic to retrieve the associated contact record (I only need the transaction amount column), check if the value is null, and set the value of the Total Transaction Amount in a variable. If the contact record has an existing value, I also increment the variable.

Connection Reference - Service Principal - Flow Conditions

Finally, I will call an Update action on the contact record, and set the Total Transaction Amount column to the value of the variable that I set in the previous step. The final step looks like the image below.

Connection Reference - Service Principal - Update Action

Now for the test. I ran the test two times, with two different results based on changes to the action that updates the contact record. The list below shows the two records that were updated in the transaction table:

Connection Reference - Service Principal - Active Transactions View

The first time that I ran the process, I used the Service Principal account that I created.

Connection Reference - Service Principal - Invoker's Connection not selected

The result in this situation was that the updated record showed it was modified by the Service Principal account, and not by the user that was set as the triggering user of the flow. I guess this makes sense, as I should be able to tell the system, for each action, whether I want it executed by the triggering user or by the service account itself.

Connection Reference - Service Principal - Updated Contacts View showing Modified By Service Principal

Since I don't really want to show that this was executed and updated by the Service Principal account, I needed to see what had to be modified to get this working. If we take a look at the additional options for the Update record action, we see that there is an option to Use invoker's connection. This basically executes the action as the same user account that was set on the trigger of the flow. The image below shows the Invoker's connection set.

Connection Reference - Service Principal - Invoker's Connection is selected

Once I changed the second transaction's Payment Status to Sent, I could see that the second updated record contains the correct Modified By value, which is the triggering user.

Connection Reference - Service Principal - Updated Contacts View showing Modified By Triggering User

The next step is to deploy this to a higher environment as part of the solution. As we already know, when we deploy across environments and have flows that use connection references, we have to provide the connection reference that the flow will use in the higher environment. In that case, we need to create a new connection reference in each environment and, when importing the solution, set it accordingly.

One thing to note is that, as connection references are still in preview, there is currently a limit on how many flows can use a single connection reference. At the time of writing, that limit is 16. I am pretty sure that by the time connection references are out of preview, that limit will change.

I will update this as we find out new changes on connection references.

Update Data in your CDS using Azure SQL Server, Azure Functions or Connectors from SQL Data
https://aric.isite.dev/dynamics/post/cds-azuresql-azurefunc-connector/
Mon, 14 Sep 2020

The post Update Data in your CDS using Azure SQL Server, Azure Functions or Connectors from SQL Data appeared first on Aric Levin's Digital Transformation Blog.

Recently, while working on a project that needed to update exchange rates in CDS, I was tasked with finding a solution that would be able to retrieve data from a SQL Server hosted on an Azure Virtual Machine. There were many different approaches, and security was the main concern, but I decided to do a deep dive and test out how Power Automate would be able to accommodate these requests.

The first thing, of course, was creating the SQL Server database and adding the data to it. I created a table in SQL Server that would be updated on a regular basis, containing the exchange rates that I needed. For the purpose of this post, I used Azure SQL Server, and not a managed instance or a SQL Server hosted on an Azure virtual machine. The image below shows the table and values that were added to the SQL Server database.

Exchange Rates Solution - SQL DB Design

You can see in the image that we store the Exchange Rate Code, the Exchange Rate Date (in date format and string format) and the Exchange Rate value on that particular date.

Next, I created an entity in CDS to store that information. I called the new entity Transaction, and added fields for the Transaction Amount, Exchange Rate and the Transaction Amount in US Dollars. When adding a new record to the Transaction table, we will only enter the Currency and the Transaction Amount.

The first test was to create a Power Automate flow that triggers on the creation of a new Transaction record, retrieves the code name of the currency, and then calls Azure SQL Server (using a username and password connection) to get the rows that correspond to the query that we provided.

I then initialized a CurrentDate variable that would contain today's date in the format of the date string that I created in the database. The formula for that date was:
formatDateTime(utcNow(), 'yyyyMMdd')
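For readers more familiar with JavaScript than with flow expressions, the same formatting can be sketched like this:

```javascript
// JavaScript equivalent of the flow expression formatDateTime(utcNow(), 'yyyyMMdd').
function formatYyyyMmDd(d) {
  const pad = n => String(n).padStart(2, "0");
  return d.getUTCFullYear() + pad(d.getUTCMonth() + 1) + pad(d.getUTCDate());
}

console.log(formatYyyyMmDd(new Date(Date.UTC(2020, 8, 14)))); // prints 20200914
```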

I used the Get rows (V2) action, adding a Filter Query specifying a condition where the Exchange Rate Code is the code I previously retrieved and the date is the current date, which was set in the previous step. The Select Query returns the Exchange Rate value that I needed.
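The Filter Query uses OData syntax against the SQL table's columns. The column names below are hypothetical stand-ins for the ones in the screenshot, but the shape would be roughly:

```
ExRateCode eq 'EUR' and ExRateDateString eq '20200914'
```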

The image below shows the steps to get here.

Exchange Rate Solution - Power Automate Flow #1

Next, although I know I will have a single result, I still used an Apply to each, as I used Get rows for my SQL Server connection. I add the value property to the Apply to each action, and then add a Compose step to calculate the amount in US dollars. The calculation here is slightly more complex, but still fairly easy if you know the names of your actions and items. It uses mul to multiply two values, converting each of the values to float to allow the multiplication to happen.

mul(float(items('Apply_to_each_3')?['ExRateValue']), float(triggerOutputs()?['body/nyc_transactionamount']))
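The same computation in plain JavaScript, assuming both inputs arrive as strings the way flow outputs often do:

```javascript
// The Compose step's logic: convert both values to numbers, then multiply,
// mirroring mul(float(...), float(...)) in the flow expression.
function toUsd(exRateValue, transactionAmount) {
  return parseFloat(exRateValue) * parseFloat(transactionAmount);
}

console.log(toUsd("0.5", "100")); // prints 50
```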

The final step is to update the Transaction record, passing the ExchangeRateValue from the SQL query results and the Transaction Amount in US dollars. The image below shows the logic built in the Apply to each section.

Exchange Rate Solution - Power Automate Flow #2

Next, we save the flow and test this logic. To test it, we navigate to our Model-driven app and provide the Currency and the Transaction Amount, as shown in the image below. As the flow was created to trigger on the creation of a new transaction record or an update of the total amount, it will execute. The image below shows the information that was entered in the record before the flow executed. You will notice that the transaction number, transaction exchange rate and transaction amount (USD) are left empty.

Exchange Rates Solution - New Transaction record (CDS)

You can wait approximately 5-10 seconds and then see the results of the flow, as shown below:

Exchange Rates Solution - Completed Transaction record (CDS)

Now that we saw that we could do this by connecting to SQL Server directly, I was going to test a few other alternatives. I created an Azure Function that would connect to the SQL Server and retrieve the same value for me. The main benefit is that this Azure Function would be available across the board, even to callers that do not have access to SQL Server. You will need to add the System.Data.SqlClient NuGet package to your Azure Functions project and reference it in the header of your file. The function code expects two values in the query string: the currency code and the formatted date. The code is not fully optimized, but you can see it below:

Exchange Rate Solution - Azure Api function (Code)
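Since the original function code is only shown as an image, here is a rough sketch of the query it would run. Table and column names are assumptions; in production the values should be bound as SQL parameters rather than concatenated into the query:

```javascript
// Build the lookup query an HTTP-triggered function could run against the
// exchange-rate table. Names are hypothetical; use parameterized SQL in real code.
function buildExchangeRateQuery(code, dateString) {
  return "SELECT ExRateValue FROM ExchangeRates " +
         "WHERE ExRateCode = '" + code + "' AND ExRateDateString = '" + dateString + "'";
}

// The function would read ?code=EUR&date=20200914 from the query string,
// run this query (System.Data.SqlClient in C#), and return the single
// exchange-rate value in the response body.
console.log(buildExchangeRateQuery("EUR", "20200914"));
```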

Now that we have created the Azure Function, let's take a look at how the modified Power Automate flow looks. The trigger and the first three actions are the same. Instead of Get rows, we have an HTTP request, and we pass the formatted URL containing the code and date parameters that the Azure Function needs. We use a Compose action, similar to what we did before, to multiply the result from the Azure Function (the exchange rate) by the total amount of the base currency: mul(float(body('HTTP')), float(triggerOutputs()?['body/nyc_transactionamount']))

Finally, we update the record in CDS with the values of the Exchange Rate and the result of the multiplication. The image below shows the final flow:

Exchange Rate Solution - Power Automate flow calling Azure Api function

I ran the same tests in CDS and the results were the same, only now using an API that can be called from anywhere.

Now that I saw that this works, I wanted to add another option. I built a custom connector that calls the Azure API, so that it can be accessed directly from a flow. This worked nicely as well. The image below shows the test that I ran within the connector builder; the connector can be added to a Power Automate flow easily.

Exchange Rate Solution - Custom Connector

Although this entire process is doable, it is somewhat cumbersome, as you have to update your exchange rates on a daily basis and make sure that the data in your database is synced. There are also third-party APIs available that allow you to just call a custom API and retrieve the data that you are looking for.

One of these solutions is called XE Currency Data; for $800/year you get 10,000 API requests per month, and additional requests are available at a higher price. I have not reviewed other vendors of similar APIs, but this seems to be something that should be readily available, and maybe even free for smaller request volumes.

Microsoft Forms Pro is becoming Dynamics 365 Customer Voice
https://aric.isite.dev/dynamics/post/forms-pro-becoming-customer-voice/
Wed, 02 Sep 2020

The post Microsoft Forms Pro is becoming Dynamics 365 Customer Voice appeared first on Aric Levin's Digital Transformation Blog.

In late July, during the Inspire 2020 event, Microsoft made an announcement about upcoming changes to their forms and survey application (Forms Pro), rebranding it as a completely new product: Dynamics 365 Customer Voice.

The announcement after Inspire was published in the Dynamics 365 blogs:

https://cloudblogs.microsoft.com/dynamics365/bdm/2020/07/22/democratizing-customer-feedback-with-the-new-dynamics-365-customer-voice/

Depending on where in the world your tenant is, this push would have happened sometime during August. If you try to access Forms Pro using the forms.office.com site, you will get a notification that Forms Pro is now Dynamics 365 Customer Voice, and that you should be accessing your surveys there. It also mentions that your surveys will no longer be available on the Forms web site after 9/30/2020.

Forms Pro to Dynamics 365 Customer Voice - Transition Message

When you click on Open Customer Voice, you will see a completely new interface (and URL). The home page provides a New Project or Get Started button, from which you can create a new survey either from a template or from blank. Out of the box, at this time, there are 4 available templates, covering a customer service survey, a delivery survey, a service visit survey and a support survey.

Forms Pro to Dynamics 365 Customer Voice - New Project

After you select your survey, you will be asked which environment you want to connect the survey to, and the survey will be created automatically, with a predefined list of questions. You can remove any of the predefined questions or add new questions, similar to the way this was done in Forms Pro. The navigation and the customization of the survey have been redesigned and revamped, though.

The left pane, or main navigation pane, contains the list of surveys that are associated with your current project. This means that you can have a single project with multiple surveys within it, as well as the satisfaction metrics and the outcome of your survey, which shows you the responses from the survey.

The right pane is your customization pane. This allows you to customize parts of your survey with branching, variables, languages, branding, formatting and satisfaction metrics. Clicking on any one of these links will show the settings for that particular area of customization.

Forms Pro to Dynamics 365 Customer Voice - Customization Pane

Once you are done working on your survey, you can easily send it out from within Customer Voice. The Send tab provides various options for sending out the survey: via Automation, Email, Embedding, Link or QR Code. The screen below shows the Send tab of the survey. You can personalize your email, use templates, and even embed a survey question in the email, add the survey and unsubscribe links, or personalize the survey with additional fields based on uploaded data. You can also add automation for events, such as sending surveys when leads are qualified or cases are resolved.

Forms Pro to Dynamics 365 Customer Voice - Send Survey Options

As of the time of writing, there have not been any changes to the Power Automate Forms connectors. I am not sure whether there is a plan to change them or bring additional features, but they still provide the same functionality for Customer Voice as they did for Forms Pro.

Forms Pro to Dynamics 365 Customer Voice - Power Automate

Hope this gives you some insight into the changes that have been or will be deployed into your tenants, and that they will make surveys a lot easier to use.

The post Microsoft Forms Pro is becoming Dynamics 365 Customer Voice appeared first on Aric Levin's Digital Transformation Blog.

]]>
Sending Emails to all customers using Power Automate https://aric.isite.dev/dynamics/post/send-bulk-email-flow-to-all-customers/ Sun, 26 Jul 2020 06:17:00 +0000 https://aric.isite.dev/index.php/2020/07/26/sending-emails-to-all-customers-using-power-automate/ Recently in the forums, there were a few questions on how to send emails to all of their customers. While there might be different options, and the right solution might greatly depend on the frequency with which you need to send these emails as well as the number of contacts, I thought I would demonstrate a few different approaches to implementing this using Microsoft Power Automate.

The post Sending Emails to all customers using Power Automate appeared first on Aric Levin's Digital Transformation Blog.

]]>
Recently in the forums, there were a few questions on how to send emails to all of their customers. While there might be different options, and the right solution might greatly depend on the frequency with which you need to send these emails as well as the number of contacts, I thought I would demonstrate a few different approaches to implementing this using Microsoft Power Automate.

This post will cover the following three different scenarios mentioned below:

  • Send Emails to the Primary Contact of a selected group of Accounts
  • Send Email to All Contacts associated with an Account
  • Send a scheduled email to all Accounts

When sending the email, you also have different choices of how to send it.

You can send the email using the Outlook 365 connector or the Common Data Service connector. When using the Common Data Service (Current Environment) connector, you can create the email message and send it yourself, or you can use an email template.

The main thing to consider when sending email to a large number of customers is the limits on Power Automate requests and the bulk email limits of Exchange Online. See the links at the bottom of the post on Power Platform and Exchange limits.

Scenario 1: Send Email to Primary Contact of a selected group of Accounts

For the first scenario, sending an email to the Primary Contact of a selected group of accounts, I created a flow with a When a record is selected trigger and added an input for Outage Date. This flow will send a notification to all selected accounts about an upcoming outage. There are two additional actions: one to get the Contact record of the Primary Contact, in order to retrieve the contact information and email address, and a final step that sends the email. The image below shows how the flow will look.

Send Email to all customers using Power Automate

Once the flow has completed, we will be able to see the results. The two screenshots below show the calling of the flow with the outage date provided, the flow run results, and the email message that was generated.

Send Email to all customers using Power Automate
Send Email to all customers using Power Automate

Scenario 2: Send Email to All Contacts associated with an Account

For the second scenario, I will be sending an email to all contacts associated with a particular account. We will use a similar trigger for the selection of the account, but in this case we are not going to require any input.

The first thing that we need to do is retrieve all of the Contact records that are associated with that account. We will use the List records action of the CDS (Current Environment) connector to retrieve all the associated contacts. I am using FetchXML in this case to create the linked entity.
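As a rough sketch of what that FetchXML could look like (this assumes the standard parentcustomerid relationship between Contact and Account, and the exact trigger expression for the selected account's id may differ in your flow; adjust names to your own schema):

```xml
<fetch>
  <entity name="contact">
    <attribute name="fullname" />
    <attribute name="emailaddress1" />
    <!-- skip contacts that have no email address -->
    <filter>
      <condition attribute="emailaddress1" operator="not-null" />
    </filter>
    <!-- link to the account selected in the trigger -->
    <link-entity name="account" from="accountid" to="parentcustomerid">
      <condition attribute="accountid" operator="eq" value="@{triggerBody()?['entity']?['accountid']}" />
    </link-entity>
  </entity>
</fetch>
```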

Next, we will use an Apply to each control action to create and send the email message. We will specify the subject, description, regarding and To party of the email.

Finally, we will call the SendEmail bound action on the Email Message entity. The image below shows the final result of the Power Automate flow.
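As a sketch, the bound action step uses the standard Dataverse SendEmail action; the exact labels in the CDS (Current Environment) Perform a bound action step may vary slightly, but the configuration is roughly (IssueSend tells the platform to actually send the email rather than just create the activity):

```
Action:      Perform a bound action
Entity name: Email Messages
Action name: SendEmail
Item ID:     Activity Id (output of the step that created the email message)
IssueSend:   Yes
```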

Send Email to all customers using Power Automate

Once the flow has completed, we will be able to see the results. Since this flow used the Email Messages entity to send the email, we will also see the results in CRM. The four screenshots below show the calling of the flow, the flow run results, the email messages that were sent, and the content of an individual email message.

Send Email to all customers using Power Automate

Send Email to all customers using Power Automate

Send Email to all customers using Power Automate

Send Email to all customers using Power Automate

Scenario 3: Send Scheduled Email to Primary Contact associated with all Account records

In our third scenario we will create a scheduled flow that will execute every month, retrieve the primary contact for each account and send an email message to that contact containing maintenance notices.

We start this by adding a recurrence trigger that will execute every month. You can change the schedule as you need: run it on a daily, weekly or monthly basis, or if it needs to execute every few hours, that is fine too. We then call the List records CDS action to retrieve account records. I used a query this time with the expand option to retrieve the Account and Contact records in one call, so that I don’t have to call a second Get record action for each contact.
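As a sketch, using the standard account and contact columns (swap in whatever fields your email actually needs), the List records step could be configured like this:

```
Entity name:   Accounts
Select query:  name,accountid
Expand query:  primarycontactid($select=fullname,emailaddress1)
Filter query:  _primarycontactid_value ne null
```

The filter skips accounts that have no primary contact, so the Apply to each loop does not have to handle empty lookups.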

Finally I add an Apply to each Control action that will Send an email to each individual record. The image below shows the finalized flow for the last scenario.

Send Scheduled Email to all customers using Power Automate

To test the final scenario, I performed a manual execution of the flow by selecting Test Flow, and then selecting the option I’ll perform the trigger action. In the Run flow window, we just click the Run flow button to execute the flow. The screenshot below shows the execution of the flow, and the 10 email messages that were sent out.

Send Scheduled Email to all customers using Power Automate

There isn’t one option that will work for everyone, but in this simple post I demonstrated three different approaches; you can pick one or combine them to send email to your customers. Remember the limitations that I mentioned earlier, which are linked below.

Hope this helps resolve anyone’s issues with sending bulk emails to their customers.

For information about the Power Automate and Power Platform request limits, see the Request limits and allocation document here.

Power Platform request limits and allocations

For information about Sending limits using Office 365 Plans, see the Exchange Online limits documentation and review the Sending limits section:

Exchange Online limits

The post Sending Emails to all customers using Power Automate appeared first on Aric Levin's Digital Transformation Blog.

]]>
The Road to modern Virus Scanning https://aric.isite.dev/azure/post/road-virus-scan/ Thu, 02 Jul 2020 05:34:00 +0000 https://aric.isite.dev/index.php/2020/07/02/the-road-to-modern-virus-scanning/ I have been working in the Government space for a few years now, and most implementations of the Dynamics and Azure tenants and environments are hosted in the Government Cloud. This means that there are a lot of restrictions that we have to deal with, not only from Microsoft but also from the internal IT policies.

The post The Road to modern Virus Scanning appeared first on Aric Levin's Digital Transformation Blog.

]]>
I have been working in the Government space for a few years now, and most implementations of the Dynamics and Azure tenants and environments are hosted in the Government Cloud. This means that there are a lot of restrictions that we have to deal with, not only from Microsoft but also from the internal IT policies.

A few years back we launched our first Dynamics application, where one of the requirements was the ability to scan files uploaded by end users, whether from the Dynamics application or from Dynamics Portals. These documents would be uploaded to an Azure Blob Storage container, and as they were uploaded, copies were placed in a separate quarantine container until they could be scanned.

At the time, our options for a virus scanning solution were limited. We had an on-premise McAfee virus scanning appliance available to us, and we ended up with a solution that would check every few minutes whether there were pending quarantined uploads, scan them, and then move them from the quarantine container back to the clean container.

The diagram below shows the high-level solution that was implemented.

Virus Scanning Solution - Scheduled Run

This solution worked fine when traffic was low, but at times of high traffic it would not finish processing the files in the allocated time, and we needed a different solution. The heavy traffic was mostly experienced in the last few months, during the COVID-19 pandemic, when the number of applications we received was substantially higher.

We needed to find a solution with a quicker turnaround. We had used the Azure Service Bus in previous projects to pass information between our Dynamics environment and our on-premise servers, so this should work here as well. We would change the process to handle this in real time: as the file is uploaded to our Azure Storage container, we would immediately post a message to the Azure Service Bus.

I have written a few posts about Azure Service Bus in the past, so if you are interested in that implementation, click here.

With this solution, the Azure Service Bus listener is called as soon as the file is uploaded, and the file is sent to the virus scanner. We could also bypass the quarantine container entirely and only send the file there after the scan, if it turned out to be infected. The diagram below shows the new solution.

Virus Scan Solution - Real Time (Azure Service Bus)

As I mentioned, we are in GCC and there are a lot of limitations, both in the list of available connectors and in the implementations that can be done, but I wanted to address this as if it were done in a Commercial Cloud.

I noticed a couple of weeks ago that Microsoft announced the availability of a new virus scanner connector called VirusTotal. I was not aware of other options, but when I did some searching I found three connectors with the capability of scanning documents (or document streams): VirusTotal, Cloudmersive Virus Scan and Microsoft Defender ATP. This was great; it would simplify the logic.

Regardless of which virus scanner you are using, you will need to get an API key from the vendor of the virus scanning connector in order to establish a connection. Depending on your scanning load, this can be free or cost you some money. I think most of these vendors offer about 4,000 free scans a month.

If you are using Dynamics Portals or Power Apps Portals, you can upload your documents to either an Azure Blob Storage container or SharePoint. The following flow executes when a new file is uploaded to a SharePoint folder, scans the file for viruses, and creates a CDS record with the status of a successful or unsuccessful scan. Let’s review this step by step.

The first part is going to be our trigger. When a new document is uploaded to SharePoint (or Azure Blob) the flow will be triggered to get the content of that document. In case of SharePoint, the single step will provide us with the Content of the Document. If using Azure Blob, we will need an additional step to get the content of the blob based on the path of the file.

Virus Scan Solution - Power Automate Trigger (SharePoint or Azure Blob)

Next, we will call the Scan file for Viruses action. In this case we used the action from Cloudmersive, but any of the connectors should work just fine for this.

Virus Scan Solution - Scanning via ISV Connector

After the scanning is complete, we will add a condition to our flow that checks the result of the scan. CleanResult will return True if no viruses were found and False otherwise. We can then decide what action to take: delete the file, move it to a quarantine container or folder, write a record, etc. Your choice. In our case, I just wrote the result to CDS.
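The condition itself is just a comparison against the connector's CleanResult output. As a sketch (the action name inside body() must match whatever you named your scan step), the expression side of the condition could be:

```
@equals(body('Scan_a_file_for_viruses')?['CleanResult'], true)
```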

Virus Scan Solution - Post Scanning

That is basically it. You can add additional logic as needed, but this is all it takes. Again, if you are in a Government Cloud or your IT department is blocking certain connectors, this might not be the solution for you, but if you are able to implement it, it might save you a lot of trouble and headaches.

The post The Road to modern Virus Scanning appeared first on Aric Levin's Digital Transformation Blog.

]]>
Creating Outlook Events from a OneDrive Excel spreadsheet https://aric.isite.dev/sharepoint/post/outlook-events-onedrive-excel/ Sun, 28 Jun 2020 06:28:00 +0000 https://aric.isite.dev/index.php/2020/06/28/creating-outlook-events-from-a-onedrive-excel-spreadsheet/ Last Friday, while working with one of our clients on an integration issue between Dynamics and the Microsoft Exchange global address book, he asked me if there was a way for them to have automated creation of events (calendar items) in their Microsoft Outlook calendar where the source of the data was an Excel spreadsheet in OneDrive. This is a short post, but might help someone out who needs to implement this.

The post Creating Outlook Events from a OneDrive Excel spreadsheet appeared first on Aric Levin's Digital Transformation Blog.

]]>
Last Friday, while working with one of our clients on an integration issue between Dynamics and the Microsoft Exchange global address book, he asked me if there was a way for them to have automated creation of events (calendar items) in their Microsoft Outlook calendar where the source of the data was an Excel spreadsheet in OneDrive. This is a short post, but might help someone out who needs to implement this.

My first thought was Server Side Sync, of course, but there was no reason for these appointments to be in Dynamics. Seconds later my reaction was: this should be easy enough using the Excel and Outlook connectors. I started testing a few scenarios, and within about 30 minutes we had a solution that was ready to test. What better solution than creating a Power Automate flow?

I created a manual flow first just to test the logic, but afterwards modified it to a scheduled flow.

The first step was to create an Excel spreadsheet with the data elements that we needed. I created a spreadsheet and a table within the spreadsheet and called it EventsTable.

Power Automate Flow - Outlook Events

This table can naturally grow or shrink based on the data that is needed.

Next, we created the first action in the flow (after the manual trigger, which we later replaced with a schedule): the List rows present in a table action from the Excel Online (Business) connector.

Power Automate Flow - Outlook Events - List rows from table

We specified the Location, Document Library, Filename and the name of the table.

The next step is looping through all the rows of the table and creating events from them. I added an Apply to each control and, within it, an Outlook 365 Create event (V4) action to create the event record in Outlook 365.

Power Automate Flow - Outlook Events - Apply to each Create event

That is basically it. I ran the flow to test it out, and within seconds I could see the three events (calendar appointments) within my Outlook Web App.

Power Automate Flow - Outlook Events - Outlook calendar results

The only thing left to do was change the manual trigger. I just deleted the manual trigger, added the Recurrence (Schedule) trigger, and set it to run on a daily basis.

Power Automate Flow - Outlook Events - Scheduled Flow

We could add additional conditions to check whether the event is already there, or update the Excel spreadsheet to mark that an event has been created, but I will leave that for future requirements.

Thank you Cristian for asking how to get this done…

The post Creating Outlook Events from a OneDrive Excel spreadsheet appeared first on Aric Levin's Digital Transformation Blog.

]]>
Changing record statuses in UCI and 2020 Wave 1 release https://aric.isite.dev/dynamics/post/change-status-2020-wave-1/ Wed, 13 May 2020 09:00:00 +0000 https://aric.isite.dev/index.php/2020/05/13/changing-record-statuses-in-uci-and-2020-wave-1-release/ When working with model-driven apps, we have the ability to navigate through the flow of the record using the Business Process Flow. The BPF is a great way to navigate through the different stages of the record, but what if that is just not enough? What if, in order to complete the record, I have to go through 10 or maybe more stages?

The post Changing record statuses in UCI and 2020 Wave 1 release appeared first on Aric Levin's Digital Transformation Blog.

]]>
When working with model-driven apps, we have the ability to navigate through the flow of the record using the Business Process Flow. The BPF is a great way to navigate through the different stages of the record, but what if that is just not enough? What if, in order to complete the record, I have to go through 10 or maybe more stages?

In previous work, we implemented logic that allowed us to navigate by displaying a popup window on the form (using alert.js), and we called the logic inside an HTML web resource. A lot of these options are being deprecated, and we will be prevented from making Web API calls from HTML web resources.

With the 2020 Wave 1 release, we can now use the Xrm.Navigation.navigateTo command as well as Xrm.Utility.lookupObjects to provide similar functionality.

Our use case is as follows: when a CDS user works on a record and wants to change its status, they will be able to change it only to statuses that are available (not to all statuses), and the system will log the history of status changes. There are various ways of implementing this, and in this particular case I chose the one that best fits our business model.

Let’s first start by creating the data elements that we need. We will use three entities, called Statuses, Status Mappings and Status History.

The Status entity will just contain a list of statuses with no particular logic, and will contain a single field called Status name as shown in the image below. I set the Ownership of this entity to Organization.

Dynamics CRM 2020 Wave 1 - Change Status

The Status Mapping entity will contain the mappings between statuses, that is, the ability to navigate from one status to another. This means that it holds the current status and the next status, so that I can query this entity. Since I used this for applications, I also have an Application Type lookup on the record. This entity can have either type of ownership: if there are no security restrictions, it can be organization-owned, and if you need to implement security or sharing, you can set it to User/Team owned. The image below shows the required fields for this entity.

Dynamics CRM 2020 Wave 1 - Status Mapping Entity

The last entity is Status History. We also have an Application entity, which we will discuss later, but that can be any entity you desire. The Status History entity will contain a history of your statuses as you move from one status to another. It will contain a lookup to the parent entity (in our case Application) and a lookup to the Status entity. The ownership for this entity will be User/Team. The image below shows the basic fields for this entity.

Dynamics CRM 2020 Wave 1 - Status History Entity

In our application entity, we will also create a lookup to the Status entity, so that we can store the current status. This field will be read only so that users cannot make any changes to it directly, but only by going through the Change Status process.

The next thing we will do, on the parent entity (in our case the Application entity), is add a ribbon button called Change Status that calls a JavaScript function to start this process. The image below shows the Change Status button in the command bar of the Application entity.

Dynamics CRM 2020 Wave 1 - Change Status Command Bar

Using Ribbon Workbench, we create the button and add a Command that uses a Custom JavaScript Action, passing the CRM parameter PrimaryControl, which gives us the execution context for the function. We set a Display Rule with a Form State Rule to show the button everywhere except on Create.

The image below shows the configuration that we set in Ribbon Workbench. Thank you, Scott Durow.

Dynamics CRM 2020 Wave 1 - Change Status Ribbon Workbench

Next, we will start working on the Change Status JavaScript function for the Application entity. This function is straightforward, but it took a little bit of trial and error to get the exact results I was looking for.

Within the function I retrieve the current status and the application type and call the Xrm.Navigation.navigateTo function to pop up a new record window when the button is clicked, passing the application id, application type and current status as parameters.

The code below shows the changeStatus function.

 function changeStatus(context)
 {
   var entityId = context.data.entity.getId();
   // Primary name value of the record, used as the display name of the lookup below
   var entityName = context.data.entity.getPrimaryAttributeValue();

   var currentStatus = context.getAttribute("bac_applicationstatusid").getValue();
   var currentStatusId = currentStatus[0].id;

   var applicationType = context.getAttribute("bac_applicationtypeid").getValue();
   var applicationTypeId = applicationType[0].id;

   // Set default values for the new Status History record
   var formParameters = {};
   formParameters["bac_applicationid"] = entityId;
   formParameters["bac_applicationidname"] = entityName;
   formParameters["bac_applicationidtype"] = "bac_application";
   // Pass the application type and current status to the form through the name field
   formParameters["bac_name"] = applicationTypeId + ":" + currentStatusId;

   Xrm.Navigation.navigateTo({
       pageType: "entityrecord",
       entityName: "bac_statushistory",
       data: formParameters
     },
     {
       target: 2, // 2 - open the record in a modal dialog
       position: 2, // 1 - dialog in center, 2 - side panel
       width: {
         value: 30,
         unit: "%"
       }
   }).then(function(result) {
     var reference = result.savedEntityReference;
   });
 }

Within the Status History form, I added three functions: an onLoad function, a statusOnChange function and a setLookupOptions function.

The OnLoad function disables the name and status fields on the form, as these should not be modified by the user directly because we want to limit the available options that the user can select from. It verifies that this is a create form, and then calls the statusOnChange function.

 function onFormLoad(executionContext) {  
   var formContext = executionContext.getFormContext();  
   var formType = formContext.ui.getFormType();  
   
   formContext.getControl("bac_status").setDisabled(true);  
   formContext.getControl("bac_name").setDisabled(true);  
   
   if (formType == 1)  
     statusOnChange(executionContext);  
 }  

The statusOnChange function gets the application type and status from the parameters that were passed to the form (and temporarily placed in the name attribute), and uses the Xrm.WebApi.retrieveMultipleRecords function to get all of the available “next” statuses for that application type and current status, putting them in a statuses array as key/value pairs.

Once all elements have been added to the Statuses array, the function will call the setLookupOptions function.

 function statusOnChange(executionContext)
 {
   var formContext = executionContext.getFormContext();
   var nameControl = formContext.getControl("bac_name");

   // The application type and current status ids were passed in via the name field
   var name = nameControl.getAttribute().getValue();
   nameControl.getAttribute().setValue("");
   var inputParams = name.split(":");
   var currentTypeId = inputParams[0];
   var currentStatusId = inputParams[1];

   // Call the Web API to get the available "next" statuses for this type and status
   var statuses = [];
   var filter = "(_bac_applicationtypeid_value eq " + currentTypeId + " and _bac_currentstatusid_value eq " + currentStatusId + ") or (_bac_currentstatusid_value eq null and statecode eq 0)";
   Xrm.WebApi.retrieveMultipleRecords("bac_statusmapping", "?$select=bac_name,_bac_nextstatusid_value&$filter=" + filter).then(
     function success(result) {
       for (var i = 0; i < result.entities.length; i++) {
         var statusId = result.entities[i]["_bac_nextstatusid_value"];
         var statusName = result.entities[i]["_bac_nextstatusid_value@OData.Community.Display.V1.FormattedValue"];
         statuses.push({id: statusId, name: statusName});
       }
       // Populate the lookup with the retrieved statuses
       setLookupOptions(formContext, statuses);
     },
     function (error) {
       console.log(error.message);
       // handle error conditions
     }
   );
 }

The setLookupOptions function uses Xrm.Utility.lookupObjects to show, in a lookup window, only the statuses that are available for the user to select. We set the lookup options to disallow multiple selections, disable the most recently used items, and filter the list to the correct statuses. The user’s selection is then populated on the form (displayed in the panel). The user then has the choice to enter a comment for that status if required.

 function setLookupOptions (formContext, statuses)  
 {  
   var lookupOptions = {};  
   lookupOptions.allowMultiSelect = false;  
   lookupOptions.disableMru = true;  
   lookupOptions.defaultEntityType = "bac_status";  
   lookupOptions.entityTypes = ["bac_status"];  
   
   var fetchXml = "<filter type='or'>";
   statuses.forEach(function(status) {
     fetchXml += "<condition attribute='bac_statusname' operator='eq' value='" + status.name + "' />";
   });
   fetchXml += "</filter>";
   
   lookupOptions.filters = [{  
           filterXml: fetchXml,  
           entityLogicalName: "bac_status"  
      }];  
   
   Xrm.Utility.lookupObjects(lookupOptions)  
   .then(function(result)  
   {  
     if (result != undefined && result.length > 0)  
     {  
       var selectedItem = result[0];  
       formContext.getControl("bac_status").setDisabled(false);  
       formContext.getControl("bac_name").setDisabled(false);  
       
       formContext.getAttribute("bac_status").setValue([{ entityType: selectedItem.entityType, id: selectedItem.id, name: selectedItem.name }]);  
       formContext.getAttribute("bac_name").setValue(selectedItem.name);  
   
       formContext.getControl("bac_status").setDisabled(true);  
       formContext.getControl("bac_name").setDisabled(true);  
   
     }  
   }, function(error)  
   {  
     alert(error.message);  
   });  
 }  

After the Status History record is saved, a flow triggered on record creation updates the Status field on the parent Application record.

The animated image below shows how the process looks after the implementation is complete.

Change Status Demonstration

This particular logic can be customized for any business need where the Business Process Flow is not enough: when you need to control who can change the statuses of your records, implement logic for field options, and more. This is just the tip of the iceberg in terms of how far this logic can be taken, if it makes sense for your organization.

The post Changing record statuses in UCI and 2020 Wave 1 release appeared first on Aric Levin's Digital Transformation Blog.

]]>
New Features in Form Processing for AI Builder https://aric.isite.dev/business-solutions/post/ai-builder-form-processing-new-features/ Tue, 05 May 2020 08:30:00 +0000 https://aric.isite.dev/index.php/2020/05/05/new-features-in-form-processing-for-ai-builder/ Over the past year, I played here and there with AI Builder, and particularly with form processing, but found it somewhat cumbersome for some of the forms that I wanted to build, especially when working in the government space. Yesterday, May 4th, Microsoft announced some new changes to AI Builder form processing that allow recognition of undetected fields. I decided to test this out, and to implement it I used the IRS W-9 form, which is not such an easy form to implement.

The post New Features in Form Processing for AI Builder appeared first on Aric Levin's Digital Transformation Blog.

]]>
Over the past year, I played here and there with AI Builder, and particularly with form processing, but found it somewhat cumbersome for some of the forms that I wanted to build, especially when working in the government space. Yesterday, May 4th, Microsoft announced some new changes to AI Builder form processing that allow recognition of undetected fields. I decided to test this out, and to implement it I used the IRS W-9 form, which is not such an easy form to implement.

First things first: log in to your Power Apps environment, and in the left navigation, under AI Builder, click on Build. This will show you the available models. In our case we will use the Form Processing model, which was recently updated.

AI Builder Start Screen

To get started, we will provide a name for the AI model and click Create. Notice that we will be required to provide 5 or more documents with the same layout.

AI Builder Form Processing

The AI Builder model wizard will start and ask us to upload the documents. You can click the Add documents button on the command bar, or on the screen, to start.

AI Builder - Add Documents

You can add your documents from local storage, SharePoint or an Azure Blob Storage, something that was not available in earlier releases.

AI Builder - Add Documents from Local Storage, SharePoint or Azure Storage Container Blob

In my case, I had 6 documents of the IRS W-9 form, prepared earlier, available in local storage.

Upload Documents

The wizard will then show you the list of documents that you selected. Make sure that you have selected at least the required 5 documents. Click on the Upload documents button.

Upload Documents

You will then see that the documents have been uploaded. Click on the Close button to continue with the wizard and analyze the document structure.

AI Builder Upload Documents Successfully

Next, you will see thumbnails of all the documents that you uploaded. To start analyzing the documents, click Analyze, or if you need to add more documents, click the Add documents button in the command bar.

AI Builder Start Analyzing Documents

A modal window will pop up showing that the process is analyzing the documents. Depending on the complexity of the documents, the number of pages and the number of fields, this can take up to a few minutes. In my case, since they were all single-page documents without too many fields, it took a little under a minute.

AI Builder - Analyzing Documents

After the analysis has completed, the wizard will move to the next phase, which is selecting the form fields. It will show you that 0 fields have been selected. Click on the thumbnail and then click the Next button to start selecting the fields.

AI Builder - Field Selection

You will be navigated to a page that contains a copy of your form with the fields that AI Builder found. You will need to click on each of the fields that you want as part of the form and add it to the list of selected fields.

AI Builder Form Processing Field Selection

You can simply select the field on the form, change the name of the field if needed, and click the check mark in the tooltip window.

AI Builder Select Fields

Repeat this for all the fields that you want. If there are fields that AI Builder did not detect or select for you, you can select them yourself (a new feature from yesterday’s release) and add them to the list of fields.

AI Builder Select Fields

Once done, accept the fields, which will redirect you to a Save page. Select the form fields that you want to save, and save your changes.

AI Builder - Form fields selection complete

The next and almost final phase is to train the AI model. Click the button to start the training. This might take some time, so you can navigate away and come back later when the training has completed.

AI Builder Form Processing Training

Once the training has completed, you can do a test run with a different form and then publish your changes. When your changes have been published, you can use the AI model directly from a canvas app or a Power Automate flow.

AI Builder Form in Power Apps Canvas App

This entire process took less than an hour from start to finish (once I had the forms completed and ready). Now it’s your turn. Find a form that you would like to use AI for, and see how easy it is to get this up and running.

Please review the blog post from Joe Fernandez on the Microsoft Power Apps blog:

https://powerapps.microsoft.com/en-us/blog/ai-builder-form-processing-now-lets-you-recognize-undetected-fields/

The post New Features in Form Processing for AI Builder appeared first on Aric Levin's Digital Transformation Blog.

]]>