Plugins Archives - Aric Levin's Digital Transformation Blog
https://aric.isite.dev/tag/plugins/
Microsoft Dynamics 365, Power Platform and Azure

Using Microsoft Multiselect Lookup in your Model Driven Apps – Part II
https://aric.isite.dev/development/post/multiselect-lookup-pcf-overview-update-nn/
Sun, 09 Jan 2022 16:00:00 +0000

In the last blog post, I demonstrated how to use the Microsoft Multiselect Lookup control (similar to the Activity Party control), which was released as part of an update to Field Services, to add new values to a backend many-to-many (N:N) relationship. In this post, I am going to extend this further and show how to add and remove items from the multiselect control and have those changes reflected in the relationship.

The post will go through adding existing values to the Multiselect control and then removing them, and seeing how these values are added to and removed from the control.

Let's start by showing how this works. The image below shows the control with the existing values, as well as the corresponding JSON string and the text box containing only the text values of the control, which are displayed in the view.

Microsoft Multiselect Lookup - N:N Update 1

You will notice that the form has the values of three separate contacts that were previously added when the account record was created. I will now go ahead and remove two of the values from the form, and you will see that both the Contacts (JSON) string and the Contact Values text get updated.

Microsoft Multiselect Lookup - N:N Update - Removal

Finally, let’s go ahead and add one of the contacts back. You will now see that the Contact Values and the Contacts (JSON) string got updated.

Microsoft Multiselect Lookup - N:N Update - Addition

At the end of the post, you will be able to see a video of this in action, including everything that was added in the previous post (creating a new record). Now let's jump into how this was built. This is slightly more complex than the create logic from the previous post, as there are various conditions that have to be met.

First let’s take a look at the JSON configuration that will be added to the Plugin step. Note that the plugin step is now different, because it runs on a single attribute. If you have multiple attributes that you need to run this against, then each one will have a separate plugin step. The sample JSON below shows you how to configure this:

[{
	"_attributeName": "crde5_contacts",
	"_textAttributeName": "crde5_contactvalues",
	"_relationshipType": "native",
	"_relationshipName": "crde5_accounts_contacts",
	"_details":
	{
		"_primaryEntityName": "crde5_accounts_contacts",
		"_relatedAttributeName": "accountid",
		"_linkToEntityName": "contact",
		"_linkFromAttributeName": "contactid",
		"_linkToAttributeName": "contactid",
		"_linkToPrimaryAttributeName": "fullname"
	}
}]

The first section is the same as for the create, with the exception of the details. I used the same serialization class for both plugins (create and update). The details contain additional parameters which allow me to add a link to the related entity to pull values from. This is required to get the existing values that are already in the N:N relationship that was created for this object. You can omit the linked entity part, but I added it in order to be able to retrieve the name value from the related entity of the relationship entity.

Next, let's look at the changes to the serialization class. The LookupAttribute class now contains an additional Data Member of type LookupAttributeDetails, which is added as a variable and to the class constructor. The LookupAttributeDetails class itself is also new. You can see the code changes below:

[DataContract]
public class LookupAttribute
{
	[DataMember] 
	public string _attributeName { get; set; }
	[DataMember]
	public string _textAttributeName { get; set; }
	[DataMember] 
	public string _relationshipType { get; set; }
	[DataMember] 
	public string _relationshipName { get; set; }
	[DataMember] 
	public LookupAttributeDetails _details { get; set; }

	public LookupAttribute(string attributeName, string textAttributeName, string relationshipType, string relationshipName, LookupAttributeDetails details)
	{
		_attributeName = attributeName;
		_textAttributeName = textAttributeName;
		_relationshipType = relationshipType;
		_relationshipName = relationshipName;
		_details = details;
	}
}

[DataContract]
public class LookupAttributeDetails
{
	[DataMember] 
	public string _primaryEntityName { get; set; }
	[DataMember]
	public string _relatedAttributeName { get; set; }
	[DataMember] 
	public string _linkToEntityName { get; set; }
	[DataMember] 
	public string _linkFromAttributeName { get; set; }
	[DataMember] 
	public string _linkToAttributeName { get; set; }
	[DataMember] 
	public string _linkToPrimaryAttributeName { get; set; }

	public LookupAttributeDetails(string primaryEntityName, string relatedAttributeName, string linkToEntityName, string linkFromAttributeName, string linkToAttributeName, string linkToPrimaryAttributeName)
	{
		_primaryEntityName = primaryEntityName;
		_relatedAttributeName = relatedAttributeName;
		_linkToEntityName = linkToEntityName;
		_linkFromAttributeName = linkFromAttributeName;
		_linkToAttributeName = linkToAttributeName;
		_linkToPrimaryAttributeName = linkToPrimaryAttributeName;
	}
}

Next, let's look at our Plugin class. As in the previous Create plugin, we retrieve the Unsecure and Secure configuration from the Plugin step, which is used to populate the lookupAttributes list.

using (MemoryStream stream = new MemoryStream(Encoding.Unicode.GetBytes(_unsecureConfigData)))
{
	DataContractJsonSerializer deserializer = new DataContractJsonSerializer(typeof(List<LookupAttribute>));
	lookupAttributes = (List<LookupAttribute>)deserializer.ReadObject(stream);
}

We then retrieve the data from the actual PCF control, which contains the JSON string of all the contacts that were previously added. This is still similar to the Create Plugin.

string controlData = target.GetAttributeValue<string>(attribute._attributeName);
// Declared outside the using block so the comparison logic further down can use the deserialized list.
List<LookupObject> lookupObjects;
using (MemoryStream dataStream = new MemoryStream(Encoding.Unicode.GetBytes(controlData)))
{
	DataContractJsonSerializer dataDeserializer = new DataContractJsonSerializer(typeof(List<LookupObject>));
	lookupObjects = (List<LookupObject>)dataDeserializer.ReadObject(dataStream);
}

The first difference is that now we want to retrieve the related entity values. We do this by creating a query expression that pulls the data elements based on what we set in the unsecured configuration of the plugin step. This is required so that we can have two lists of values (current and previous) and add or remove records in the relationship based on changes in the PCF control. The code below shows the dynamic creation of the query expression, the retrieval of the related values and adding them to the collection of existing objects.

QueryExpression query = new QueryExpression(attribute._relationshipName);
query.ColumnSet.AddColumns(attribute._details._linkFromAttributeName, attribute._details._relatedAttributeName);
query.Criteria.AddCondition(attribute._details._relatedAttributeName, ConditionOperator.Equal, target.Id);
LinkEntity linkEntity = query.AddLink(attribute._details._linkToEntityName, attribute._details._linkFromAttributeName, attribute._details._linkToAttributeName);
linkEntity.EntityAlias = attribute._details._linkToEntityName;
linkEntity.Columns.AddColumn(attribute._details._linkToPrimaryAttributeName);

EntityCollection data = service.RetrieveMultiple(query);
// Declared outside the if block so the comparison logic below can use it even when no related records exist.
List<LookupObject> existingObjects = new List<LookupObject>();
if (data.Entities.Count > 0)
{
	foreach (Entity related in data.Entities)
	{
		existingObjects.Add(new LookupObject(
			related.GetAttributeValue<Guid>(attribute._details._linkToAttributeName).ToString(), 
			related.GetAttributeValue<AliasedValue>(attribute._details._linkToEntityName + "." + attribute._details._linkToPrimaryAttributeName).Value.ToString(), 
			attribute._details._linkToEntityName));
	}
}

Now that we have the two pieces of data, both of type LookupObject, we want to compare them so that we can determine which items to add to or remove from the related relationship records. The code below creates two lists of type LookupObject, called itemsToAdd and itemsToRemove, and populates them with data when there are elements to add or remove.

List<LookupObject> itemsToAdd = new List<LookupObject>(); 
List<LookupObject> itemsToRemove = new List<LookupObject>(); 

EntityReferenceCollection relatedReferencesToAdd = new EntityReferenceCollection();
foreach (LookupObject item in lookupObjects)
{
	var itemExists = existingObjects.Exists(x => x._id == item._id);
	tracingService.Trace("Item {0} does {1} exist in Related Table", item._id, itemExists.ToString());
	if (!itemExists)
	{
		itemsToAdd.Add(item);
		relatedReferencesToAdd.Add(new EntityReference(item._etn, new Guid(item._id)));
	}
}

EntityReferenceCollection relatedReferencesToRemove = new EntityReferenceCollection();
foreach (LookupObject item in existingObjects)
{
	var itemExists = lookupObjects.Exists(x => x._id == item._id);
	tracingService.Trace("Item {0} does {1} exist in Form Table", item._id, itemExists.ToString());
	if (!itemExists)
	{
		itemsToRemove.Add(item);
		relatedReferencesToRemove.Add(new EntityReference(item._etn, new Guid(item._id)));
	}
}

After adding the items to these collections, we create an Associate Request to add all of the related Items to Add, and create a Disassociate Request to remove all of the related Items that are no longer in the PCF control.

if (itemsToAdd.Count > 0)
{
	AssociateRequest addRequest = new AssociateRequest();
	addRequest.Target = target.ToEntityReference();
	addRequest.Relationship = new Relationship(attribute._relationshipName);
	addRequest.RelatedEntities = relatedReferencesToAdd;
	AssociateResponse addResponse = (AssociateResponse)service.Execute(addRequest);
}

if (itemsToRemove.Count > 0)
{
	DisassociateRequest removeRequest = new DisassociateRequest();
	removeRequest.Target = target.ToEntityReference();
	removeRequest.Relationship = new Relationship(attribute._relationshipName);
	removeRequest.RelatedEntities = relatedReferencesToRemove;
	DisassociateResponse removeResponse = (DisassociateResponse)service.Execute(removeRequest);
}

Finally, same as what we did for the Create Plugin, we are going to update the text value containing the list of Contact names to be displayed on the view.
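As a minimal sketch (assuming, as in the previous post, a lookupObjectNames list built from the values currently selected in the PCF control), that update looks roughly like this:

Entity textUpdate = new Entity(target.LogicalName);
textUpdate.Id = target.Id;
// lookupObjectNames is assumed to hold the names of the records currently selected in the control.
textUpdate.Attributes[attribute._textAttributeName] = String.Join(",", lookupObjectNames);
service.Update(textUpdate);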

You can find the code in my GitHub repository below:

https://github.com/ariclevin/PowerPlatform/tree/master/PCF/MultiSelectLookup

Using Microsoft Multiselect Lookup in your Model Driven Apps
https://aric.isite.dev/development/post/multiselect-lookup-pcf-overview/
Mon, 03 Jan 2022 04:41:00 +0000

Earlier last year (September timeframe), Microsoft released a Multiselect Lookup control (similar to the Activity Party control) as part of an update to Field Services. The control does not seem to be available across the entire spectrum yet, but if you have a Field Services License or an Enterprise license, you should be able to use this control across your Dataverse environment.

In this post, I will walk through adding the control to the form, saving the record and seeing how we can use some basic plugin logic and configuration to write that data in related entities.

The control is not linked to any of the related entities that you are adding, but only stores a JSON string value to the attribute.

Let's first go ahead and find the control. If you navigate to your Default solution, filter by custom controls and do a search for Lookup, you will find the following controls:

Multiselect Lookup - Custom Control in Maker Portal

Next we are going to need to create an attribute (column) in the Maker Portal to host the control. Since the data being stored by the control is in JSON format, you need to consider the size of the column that you are creating. The format that is stored in the control contains the Id, the name and the Entity Type Name.
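For illustration only, a single selected contact would be stored roughly as follows (the property names match the LookupObject serialization class shown later in this post; the Guid and name are made-up placeholder values):

[{"_id":"00000000-0000-0000-0000-000000000000","_name":"Sample Contact","_etn":"contact"}]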

The image below contains two fields that are being used for this control. The first will host the PCF control, while the second will host just the text values of the control in order to be displayed in a view for the users.

Multiselect Lookup - Custom Attributes for configuration

For the purpose of the demo the Contacts column is set to a length of 1000, and the Contacts Value is set to a length of 200.

Now let's go ahead and add the control to the form. This is not yet available in the Maker Portal, so we will have to use the Classic/Legacy interface to add the control to the form. In the image below we see that the control has been added to the form. We will now go to the properties of the control and select the Controls tab.

In the controls tab, we will click on the link to Add Control, and then select the control that is called MultiselectLookupControl. We then set the Web, Phone and Tablet options to use the new control instead of the Text Box (default) control.

There are two properties that need to be set in order to use the control: the entity name, called entityName_Display_Key, and the default view id, called defaultViewId_Display_Key. The view id can be retrieved from the Home Page views of the Contact table or from the Maker Portal when editing a view; it is displayed as part of the URL.

Multiselect Lookup - Custom Control Form Configuration

Now that we have added the control, we can test how the control works on the form. Note that we have not implemented any logic for what we want to do once the user adds values to the control.

The images below show how the control will look on the form once multiple values have been added, and the data that has been saved.

Multiselect Lookup - Custom Control Form Presentation

Multiselect Lookup - Custom Control View Presentation

Now that we have seen the control on the form, let's start building some of the logic that is associated with it. As the control we are going to be using is for a native many-to-many (N:N) relationship, we will need to provide some configuration data that will be stored in the plugin step. Since you can have multiple controls on the form itself, the configuration will allow the plugin to loop through all the controls and process them accordingly.

The configuration information will store the column name of the control, the type of relationship (in this case only native, but in future posts I will show how to do the same for a manual relationship), and the relationship name. I also have the text attribute name if we want to use a separate attribute for storing the names only of the selected records (this is for display in views).

The configuration contains a JSON array with the following key/value pairs: Attribute Name, Text Attribute Name, Relationship Type and Relationship Name as shown below:

[{"_attributeName":"crde5_contacts","_textAttributeName":"crde5_contactvalues","_relationshipType":"native","_relationshipName":"crde5_accounts_contacts"}]

We will store that in the Unsecure Configuration or Secure Configuration of the Plugin Step. The image below shows you how this looks within the Plugin step, but this will have to wait until the plugin is complete. The highlighted items show the required configuration.

Multiselect Lookup - Plugin Step Unsecured Configuration

Next, let’s go ahead and build the plugin. We add two files to the plugin.

The first file contains two classes which are used for the serialization/deserialization of the JSON string from the PCF control and the JSON string from the Unsecured configuration. The LookupObject class contains the id, name and entity name that are saved by the PCF control, as shown in the code below.

    [DataContract]
    public class LookupObject
    {
        [DataMember] 
        public string _id { get; set; }
        [DataMember] 
        public string _name { get; set; }
        [DataMember]
        public string _etn { get; set; }

        public LookupObject(string id, string name, string etn)
        {
            _id = id;
            _name = name;
            _etn = etn;
        }
    }

The second class, LookupAttribute, contains the information about each attribute that is configured for use by the plugin. We use the attribute name, text attribute name, relationship type and relationship name.

The attribute name is the name of the attribute of the PCF control field. The text attribute name is the name of the separate text attribute that stores a text-only concatenation of the values from the PCF control. The relationship type will contain the value native or manual based on the type of relationship, and the relationship name contains the name of the relationship that will be used for the Associate Requests. The code below shows the LookupAttribute class that will be used for serialization.

    [DataContract]
    public class LookupAttribute
    {
        [DataMember] 
        public string _attributeName { get; set; }
        [DataMember]
        public string _textAttributeName { get; set; }
        [DataMember] 
        public string _relationshipType { get; set; }
        [DataMember] 
        public string _relationshipName { get; set; }

        public LookupAttribute(string attributeName, string textAttributeName, string relationshipType, string relationshipName)
        {
            _attributeName = attributeName;
            _textAttributeName = textAttributeName;
            _relationshipType = relationshipType;
            _relationshipName = relationshipName;

        }

    }

Next, let’s look at the plugin code. The plugin contains a single class (so far) that is used for the retrieval of the secure and unsecure configuration strings and for the execution of the plugin code.

In this case we will be using the unsecure configuration. The code below shows the code to get the secure and unsecure configuration and store them in variables.

        string _unsecureConfigData = string.Empty;
        string _secureConfigData = string.Empty;

        public MultiSelectLookupPostCreate(string unsecureConfig, string secureConfig)
        {
            if (!string.IsNullOrEmpty(secureConfig))
                _secureConfigData = secureConfig;

            if (!string.IsNullOrEmpty(unsecureConfig))
                _unsecureConfigData = unsecureConfig;            
        }

Next we will look at the code of the Execute function. The entire code will be shared in my github repo, so I will only be going through the relevant code.

First we will create a List of type LookupAttribute which will contain all of the different attributes that this code will need to run for. If there are multiple PCF controls, the same code can be run for all of them. The code below shows how to read the configuration data from the JSON unsecured configuration string that was initialized previously.

List<LookupAttribute> lookupAttributes;
using (MemoryStream stream = new MemoryStream(Encoding.Unicode.GetBytes(_unsecureConfigData)))
{
   DataContractJsonSerializer deserializer = new DataContractJsonSerializer(typeof(List<LookupAttribute>));
   lookupAttributes = (List<LookupAttribute>)deserializer.ReadObject(stream);
}

Next, we will loop through each of the controls, and for each control we will get its JSON data and deserialize it.

foreach (LookupAttribute attribute in lookupAttributes)
{
   string controlData = target.GetAttributeValue<string>(attribute._attributeName);
   using (MemoryStream dataStream = new MemoryStream(Encoding.Unicode.GetBytes(controlData)))
   {
      DataContractJsonSerializer dataDeserializer = new DataContractJsonSerializer(typeof(List<LookupObject>));
      List<LookupObject> lookupObjects = (List<LookupObject>)dataDeserializer.ReadObject(dataStream);
...
   }
}

We will loop through each of the selected values in the PCF control and add them to an entity reference collection so that they can be associated. The code below shows the loop.

List<string> lookupObjectNames = new List<string>();
EntityReferenceCollection relatedReferences = new EntityReferenceCollection();

foreach (LookupObject lookupObject in lookupObjects)
{
   Guid lookupObjectId = new Guid(lookupObject._id);
   relatedReferences.Add(new EntityReference(lookupObject._etn, lookupObjectId));
                                        
   lookupObjectNames.Add(lookupObject._name);
}

Finally, now that we have the collection, we will execute the Associate Request to add all of the values to the native relationship:

AssociateRequest request = new AssociateRequest();
request.Target = target.ToEntityReference();
request.Relationship = new Relationship(attribute._relationshipName);
request.RelatedEntities = relatedReferences;
AssociateResponse response = (AssociateResponse)service.Execute(request);

We can then, if necessary, update the created record's text attribute with the name values from the PCF control, as shown below.

Entity update = new Entity(target.LogicalName);
update.Id = target.Id;
update.Attributes[attribute._textAttributeName] = String.Join(",", lookupObjectNames);
service.Update(update);

This is basically it. In the next blog articles, I will demonstrate how to update existing controls as well as how to create and update 1:n relationship records. A video will be coming shortly to demonstrate this functionality.

You can find the code in my GitHub repository below:

https://github.com/ariclevin/PowerPlatform/tree/master/PCF/MultiSelectLookup

CRUD Support for Virtual Tables
https://aric.isite.dev/development/post/crud-support-virtual-tables/
Mon, 05 Apr 2021 08:45:00 +0000

It seems like this was only a few days ago, but based on the sample that was published by Microsoft it’s been almost 2 weeks since Microsoft released CRUD support on Virtual tables.

To tell you the truth, I have not really worked with Virtual Entities since they were released. Even though I would have had a lot of reasons to actually use them, it just never seemed the right time. With the recent release, I actually did my own little PoC to determine the complexity, and if it was always this simple, I probably have something to regret.

Let's jump into it. Virtual tables (or Virtual entities) have been available for quite some time now, but only for read access. Now, with the addition of Create, Update and Delete, this gives us full access for integrating between our Dataverse environment and a publicly available SQL Server (or other platform). In today's post we will go through the process of creating the SQL Server table, creating the plugin, data provider, data source and finally the virtual table, and connecting all the pieces together.

All of the source code in this post is shown in images, but will be available via the Github Link at the bottom of the post.

Let’s start with the database. In this case I used an Azure SQL database, and created a table called Service Request. The script for creating the table is shown below.

Virtual Tables - Create Table in SQL Server

Next we create a Plugin Project in Visual Studio. You will need to install the following packages in your project in order to get access to all of the required resources (per Microsoft documentation):

  • Microsoft.CrmSdk.CoreAssemblies
  • Microsoft.CrmSdk.Data
  • Microsoft.CrmSdk.Deployment
  • Microsoft.CrmSdk.Workflow
  • Microsoft.CrmSdk.XrmTooling.CoreAssembly
  • Microsoft.IdentityModel.Clients.ActiveDirectory
  • Microsoft.Rest.ClientRuntime
  • Newtonsoft.Json

Within the plugin we will create 6 Classes. The first is a static class containing the Connection code, and the other classes are each for a message (Create, Retrieve, RetrieveMultiple, Update, Delete). You can create them as a single file or in multiple files. For the demonstration they are all created in separate files.

The following using statements have been added to all files, although not all of them are required for every class.

Virtual Tables - Using Statements

Let’s start with the Connection class. We use the SqlConnectionStringBuilder class to generate the Connection string and return it to the rest of the classes that will consume it. This is shown in the image below.

Virtual Tables - Static Connection Class in Plugin
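Since the code is only shown as an image, here is a minimal sketch of what such a connection helper might look like; the server, database and credential values below are placeholders, not the actual values used in the post:

using System.Data.SqlClient;

internal static class Connection
{
    // Builds the connection string for the Azure SQL database used by the CRUD classes.
    internal static string GetConnectionString()
    {
        SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder
        {
            DataSource = "yourserver.database.windows.net",   // placeholder server
            InitialCatalog = "YourDatabase",                  // placeholder database
            UserID = "yourusername",                          // placeholder user
            Password = "yourpassword",                        // placeholder password
            PersistSecurityInfo = false
        };
        return builder.ConnectionString;
    }
}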

Next, let’s look at the CRUD classes. All of the classes implement the IPlugin interface and call the Execute function of that Interface. In each of the class based on the required functionality we retrieve the Execution Context in order to read data from the Dataverse environment, and then synchronize between our Dataverse environment and our Azure SQL Server. The screenshots below show the code for each of these.

Create Class:
Virtual Tables - Create Class

Retrieve Class:
Virtual Tables - Retrieve Class

Retrieve Multiple Class:
Virtual Tables - Retrieve Multiple Class

Update Class:
Virtual Tables - Update Class

Delete Class:
Virtual Entities - Delete Class
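The CRUD classes are shown as images in the original post. As a rough sketch only, a Create class for a custom data provider typically follows the shape below; this assumes the conventions from Microsoft's data provider sample (the incoming record in the "Target" input parameter and the new primary key returned through the "id" output parameter), and the table, column and attribute names are placeholders rather than the actual Service Request schema:

using System;
using System.Data.SqlClient;
using Microsoft.Xrm.Sdk;

public class Create : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        Entity target = (Entity)context.InputParameters["Target"];

        Guid id = Guid.NewGuid();
        using (SqlConnection connection = new SqlConnection(Connection.GetConnectionString()))
        using (SqlCommand command = new SqlCommand(
            "INSERT INTO ServiceRequest (ServiceRequestId, Title, Description) VALUES (@id, @title, @description)", connection))
        {
            // Placeholder attribute and column names - map these to your virtual table columns.
            command.Parameters.AddWithValue("@id", id);
            command.Parameters.AddWithValue("@title", target.GetAttributeValue<string>("new_title") ?? string.Empty);
            command.Parameters.AddWithValue("@description", target.GetAttributeValue<string>("new_description") ?? string.Empty);
            connection.Open();
            command.ExecuteNonQuery();
        }

        // The data provider expects the primary key of the new row in the "id" output parameter.
        context.OutputParameters["id"] = id;
    }
}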

Once the code for the Plugin is completed we will go ahead and register the plugin via the Plugin Registration Tool. Before registering the plugin, make sure that you have the latest version of the Plugin Registration Tool that provides support for CRUD operations in virtual tables. At the time of writing this post, the required version was 9.1.0.24. You can find the download link below:

https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool

Now let’s go ahead and register the Plugin. This should include all of the five messages above. Once the Plugin is registered, we go ahead and Register a new Data Provider (using the Plugin Registration Tool).

Click on the Register button, and select Register New Data Provider (Ctrl + P).

When selecting the Data Source Entity for the first time, choose Create New from the drop down. It is important to select the solution that contains the publisher with the prefix that you want to use, or you will have to modify your code accordingly.

Virtual Tables - Plugin Registration Tool - New Data Provider

Once we are done with creating the data provider in the Plugin Registration Tool, we can go ahead and start setting up the Virtual Table in our Dataverse environment. This functionality is not available yet in the Maker Portal, so we will have to do this in the Classic interface (Advanced Settings).

Navigate to Settings and Administration, and select the Virtual Entity Data Sources, and click on the New button. A popup will appear with a drop down to select a Data Provider as shown in the image below.

Virtual Tables - Select Data Provider (Dataverse)

Select the Data Provider that you created and click on the OK button. Then you will be able to provide a name for the Data Source.

Click on Settings again and select Solutions. Select the name of the solution where you want to create the fields for this table. Check the Virtual Entity option, and select the Data Source that you just created. Enter a Display Name, Plural Name and Name for the entity. You will notice that there are two additional names, the External Name and the External Collection Name. For the External Name, you should enter the name of the source table. The External Collection Name can contain the same value as the plural table name.

Virtual Tables - Create Virtual Table (Dataverse)

Once you have finished creating the table, go ahead and create all of the columns that you are going to use to view or update within the Virtual Entity. Only add columns that you want to either read from your Azure SQL table or write back to it. Any columns that are for management only are not really required. The image below shows the columns that were created for our test:

Virtual Tables - Table Columns (Dataverse)

Next, we need to add this entity to our Model Driven App, so that we can test this out. Select an existing MDA or create a new one and add the entity to it.

Finally go ahead and test the results.

Virtual Tables - Demo

The animated screenshot above shows adding, reading and updating entity fields using CRUD operations.

You can click on the link below to get access to the Source Code:
https://github.com/ariclevin/VirtualTables

No-Code Solution for custom Change Log using Web Hooks and Cloud Flows
https://aric.isite.dev/development/post/no-code-custom-change-log-webhook-cloud-flow/
Mon, 25 Jan 2021 00:30:00 +0000

In one of our recent requirements, we had to log changes to certain fields in a few different entities. Since we needed the value of the field before and after the change, the logical option was to use plugins and add a pre-image to the step in order to save that data.

There are, however, other alternatives that might make more sense, especially if you want to minimize the amount of code and plugins in the system.

The solution is simple. We create a Webhook, add the corresponding step and Pre-Image to it, and upon execution it will trigger a Cloud flow where we can use the values from the Input Parameters and the Pre-Entity Image.

Let’s go ahead and see how we implement this. The first step is to create a Cloud flow, and set the trigger to When a HTTP request is received.

Dataverse Web Hook - Cloud Flows - HTTP Request is Received

The HTTP POST URL is not available until the flow is saved, so start by entering the request Body JSON Schema. You will also need at least one action in order to save the flow, so add an Initialize Variable action (which we will use later), set a name for the flow and save it. Once the flow is saved, the HTTP POST URL will be filled in and you will be able to configure the Webhook. Copy the HTTP POST URL and paste it into Notepad or another text editor. The result should be similar to what you see below:

https://prod-XXX.westus.logic.azure.com:443/workflows/41e03b6243cc47273a71f827fb3bd29b/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=e5138Klq7cOcbG1RJ2bXA42et4vFM0-kZ3k8jyK7Txs

Add new lines between the url and the query string parameters and remove the separators so that it looks like this:

https://prod-XXX.westus.logic.azure.com:443/workflows/41e03b6243cc47273a71f827fb3bd29b/triggers/manual/paths/invoke

api-version=2016-06-01

sp=%2Ftriggers%2Fmanual%2Frun

sv=1.0

sig=e5138Klq7cOcbG1RJ2bXA42et4vFM0-kZ3k8jyK7Txs

Replace all the %2F encodings with the corresponding character (/). This will change that line to look like this: sp=/triggers/manual/run

Now, let’s go and open the Plugin Registration Tool, and click on the Register drop down and select Register New Web Hook, or CTRL+W for short.

Dataverse Web Hook - Cloud Flows - Register Web Hook

This will open the Register New Web Hook dialog, where we will enter the HTTP URL and the different parameters that we specified above, as shown in the screenshot below.

Dataverse Web Hook - Cloud Flows - Web Hook Registration

Now that we have registered the Web Hook, you should be able to see the new Web Hook in the list of Registered Plugins. Select the Web Hook, and select Register New Step (either by Right Clicking on the Web Hook and selecting Register New Step or from the top tab menu).

Just as you would perform this step registration process for a plugin, do the same for the Web Hook. Select the Message, Primary Entity and Filtering Attributes as shown in the image below. Of course, you should customize this for your own message.

Dataverse Web Hook - Cloud Flows - Register New Step

Since we want to get the value before and after the change, we need to register a new image for the step of type Pre Image, as shown in the screenshot below. Add the name and entity alias, and specify the parameters that you want for your Pre Image.

Dataverse Web Hook - Cloud Flows - Register Pre Image

Now that we are done with the configuration within the Plugin Registration Tool, let’s take a look at where this data will be stored. We created a new entity called Change Log that contains information about the entity and the record that was modified, along with fields to store the values before and after the change.

Dataverse Web Hook - Cloud Flows - Change Log Table

Next, let's go back to the flow that we created. We will start by initializing a few variables that will be used to store some of the values, and then retrieve the values from the HTTP POST method.

It is a good idea to run the flow once so that you can review the Outputs of the When a HTTP request is received trigger, and that will help you with constructing the expressions needed for the different steps.

Dataverse Web Hook - Cloud Flows - HTTP Request received outputs

The variables are shown in the image below. The Pre and Post variables will be added later, while the Entity Id, Entity Name and User Id can be filled out during the initialization as follows:

  • triggerBody()?['PrimaryEntityId']
  • triggerBody()?['PrimaryEntityName']
  • triggerBody()?['UserId']

Dataverse Web Hook - Cloud Flows - Initialize Variables

Since getting the values from the Input Parameters and the PreImage is a little more complex, I used two Compose actions for these and set the Inputs as follows:

  • triggerBody()?['InputParameters'][0]['value']['Attributes']
  • triggerBody()?['PreEntityImages'][0]['value']['Attributes']

Dataverse Web Hook - Cloud Flows - Compose Data Operations

The final step before storing the record is an Apply to each over the Input Parameters to check the value corresponding to the attribute that was changed. We run this for both the Pre-Entity Image results and the Input Parameter results, and have a condition to check for the changed attribute.

The Condition logic expression is: items('Apply_to_each_PreEntity_Image')?['key'], and the Set Variable Pre action contains the following expression: items('Apply_to_each_PreEntity_Image')?['value']

Dataverse Web Hook - Cloud Flows - Apply to Each, Conditions and Set Variables

The same applies to the other apply to each action. Once we have retrieved all the information that we need, the change log record will be created. 

Dataverse Web Hook - Cloud Flows - Create Change Log Record

The results will look like this:

Dataverse Web Hook - Cloud Flows - Change Log Record Results

Configure Azure Service Bus to integrate between CDS and On-Premise SQL database
https://aric.isite.dev/azure/post/configure-asb-integrate-cds-sql/
Mon, 06 Jan 2020 07:17:00 +0000

In this blog post I will demonstrate how to use Azure Service Bus and a Listener application to integrate between the Common Data Service (Dynamics 365 or Power Apps Model Driven Application) and an On Premise SQL Service database.

There are various other ways to implement this, with the combination of Azure products such as Azure functions, Azure Logic Apps, On-Premise Data Gateway and Microsoft Flow, but those are not always available, especially when working in non-commercial environments, or when IT restricts what resources are available to the different departments of the agencies.

In order to implement this, there are a few prerequisites that have to be completed: set up the database server, write the console application that acts as the event listener, create the Azure Service Bus namespace, and write the plugin code that calls the Azure Service Bus when a particular event happens. The logic of this article will be as follows: when an account record gets created or updated, it will call the Azure Service Bus in order to update an On-Premise SQL database that a new record was created or an existing record was updated.

Configuring the SQL database

We will start with the first step, which is the configuration of the database. You can set up the database with a single table (to use only for Accounts), or add some related tables if necessary. We will also create a couple of stored procedures to Insert and Update the Accounts table. Links to the code files will be shared at the end of the post. The image below displays the database structure. For the purpose of this article, the Contacts and Users tables will not be required.

Azure Staging SQL Database with Tables and Stored Procedures

Create the Azure Service Bus Namespace

We can now create the Azure Service Bus. Login to your Azure Portal and search for Service Bus under the list of available Resources. Click on the Add button to add a new service bus.

Add New Azure Service Bus Namespace

This will open the create namespace window. Enter a name for your service bus in the Create Namespace window. It will append servicebus.windows.net to the name that you specify. Select a pricing tier, your subscription, resource group and location where you want this hosted. If you do not have a resource group, you can create a new one from this window.

Azure Service Bus Create Namespace

It will take a couple of minutes (or less) to create your namespace, and then the Azure Service Bus will be visible on your screen. The first thing that we need to do is check the Shared access policy. There is a default Shared Access policy that is available with the new Azure Service Bus that was created, and includes the three available claims: Manage, Send, Listen.

Since we only need Send and Listen, we will create a new Shared access policy, naming it IntegrationSharedAccessKey (or any other name that you would like) and set the available claims to Send and Listen. After you create your shared access policy, click on it to see the Keys and the Connection Strings. You will need them for configuring CDS and your Console application.

Configure the Service Endpoint in CDS

Next, let's add this to our CDS environment by running the Plugin Registration Tool. We will be using the Plugin Registration Tool for version 9.1. Run the Plugin Registration Tool and create a new connection to your CDS environment. Click on Register and select a new Service Endpoint. Copy the Connection String from your Service Bus resource in Azure, and paste it in the popup window. Make sure that the Let's Start with the connection string from the Azure Service Bus Portal… option is selected (as shown below).

Add Service Bus Endpoint Connection String

In the next window, Service Endpoint Registration make the following changes:

  • In the namespace address, change sb to https
  • In the Designation Type change to Two Way (you can also do One Way if you prefer, but in this particular example we are using two-way).
  • Enter a path for the Service Endpoint. This will be used by the console application to listen to where changes are being made.
  • The Authorization Type should be set to SASKey, with the SASKeyName as the name of your Shared Access policy in Azure, and the SAS Key copied from the Primary Key in the Shared Access policy.

Azure Service Bus Service Endpoint Registration

After we have created the Service Endpoint, we need to capture the Guid of the Service Endpoint, as we will use it from the plugin that we will soon start developing. In order to get the Guid of the Service Endpoint, click on the Properties tab and scroll down until you see the ServiceEndpointId property. Copy the Guid from there. We can hard code this in our application, or add it as an Environmental Variable inside of CDS.

Add Environmental Variable in CDS

Navigate to a solution inside of Power Apps, select the solution and click on the New button on the command bar. You will see the option to add a new environmental variable. Enter a display name, schema name (automatically populated), description and data type. As this is going to store a Guid, you should use Text as the data type. You can enter a default value for your new EndpointId or specify a new value. The screenshot below shows how this is added.

Common Data Service Environmental Variable

As the Environmental Variables are still a new feature and there are some issues that have to be dealt with, you can use the Portal Settings entity or another Configuration entity to store your variables.
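If you do store the endpoint Id as an Environment Variable, here is a minimal sketch of how a plugin could read it; this only reads the default value from environmentvariabledefinition and ignores environment-specific overrides, and the schema name passed in is a placeholder:

private static string GetEnvironmentVariable(IOrganizationService service, string schemaName)
{
	// Reads the default value of the environment variable definition with the given schema name.
	QueryExpression query = new QueryExpression("environmentvariabledefinition");
	query.ColumnSet = new ColumnSet("defaultvalue");
	query.Criteria.AddCondition("schemaname", ConditionOperator.Equal, schemaName);

	EntityCollection results = service.RetrieveMultiple(query);
	return results.Entities.Count > 0
		? results.Entities[0].GetAttributeValue<string>("defaultvalue")
		: null;
}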

Writing the Plugin Code

We can now start developing the plugin code. This is a simple plugin that will run on the Account Create and Account Update messages. Our Plugin includes three code files: Plugin.cs, Account.cs and AccountLogic.cs.

The Plugin.cs is the standard tool generated Plugin.cs file that implements the IPlugin interface. There are only a couple of changes that were done to this class, since we need to communicate with Azure Service Bus.

We added an internal property called CloudService of type IServiceEndpointNotificationService. In the LocalPluginContext constructor, we set the value for the CloudService property.

Add Property for Cloud Service

Get Service for Azure Service Bus Listener
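The code is shown as images in the original; conceptually (a minimal sketch assuming the standard tool-generated LocalPluginContext pattern), the change looks like this:

internal IServiceEndpointNotificationService CloudService { get; private set; }

private LocalPluginContext(IServiceProvider serviceProvider)
{
    // ... existing retrieval of the execution context, organization service and tracing service ...

    // Service used to post the execution context to the registered Azure Service Bus endpoint.
    CloudService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));
}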

The Account Class adds the Create and Update events to the RegisteredEvents collection, and adds two functions: ExecutePostAccountCreate and ExecutePostAccountUpdate which get the Account Id and call the CreateAccount or UpdateAccount of the logic class.

Account Entity Plugin Logic

The AccountLogic class has a constructor that takes four parameters: the Plugin Execution Context, Organization Service, Tracing Service and the Service Endpoint Notification Service. These are all used for the different purposes of the plugin.

Both the CreateAccount and UpdateAccount functions have the exact same logic. The difference is in the message which is going to be passed to the Azure Service Bus as part of the context.

The CreateUpdateAccount function retrieves all the data from the Account record, gets the Endpoint Id (from the Environmental Variables entities), adds the Account record to the Shared Variables of the context and finally calls the Azure Service Bus, passing the context, which includes the data from the Account entity.

Create and Update Account
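A minimal sketch of that final call (assuming the account entity, the endpoint Guid and the tracing service have already been retrieved; the shared variable name is illustrative):

// Make the account data available to the listener through the shared variables.
context.SharedVariables.Add("Account", account);

// Post the execution context to the registered two-way service endpoint and capture the response.
EntityReference serviceEndpoint = new EntityReference("serviceendpoint", endpointId);
string response = cloudService.Execute(serviceEndpoint, context);
tracingService.Trace("Azure Service Bus response: {0}", response);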

After the plugin receives the response, it writes the response to a note within the account record. After the plugin is complete, make sure to sign and compile it, and then register it with the plugin registration tool. When the assembly is registered, add two messages (one for Create and one for Update of the Account entity).

Plugin Registration Tool Account Plugin

Writing the Listener Windows Service Application
The final step is to work on the Console Application. We can start by writing the helper class that will connect to SQL Server and Create a new Account record. The first thing that we need to do is add a connection string to the App.Config that is part of the Console Application. If an App.Config file does not exist, you can create one.

In the Connection String section enter the following code:

<add name="AzureStaging" connectionString="Data Source={LOCALDBSERVERNAME};Initial Catalog=AzureStaging;User ID=azureadmin;Password=Azur3Adm1n;Persist Security Info=False" />

We create a SqlDataManager class which will call the SQL server stored procedures to Create and Update the account records. The functions will receive the values of the different fields in SQL Server and add them as Stored Procedure parameters as shown below:

Listener Create Account (SQL Data Manager)
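As a rough sketch of that helper (the stored procedure and parameter names are assumptions based on the description above):

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

internal static class SqlDataManager
{
    internal static void CreateAccount(Guid accountId, string name, string description)
    {
        string connectionString = ConfigurationManager.ConnectionStrings["AzureStaging"].ConnectionString;
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("CreateAccount", connection))
        {
            // Call the stored procedure that inserts the account row.
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@AccountId", accountId);
            command.Parameters.AddWithValue("@AccountName", name);
            command.Parameters.AddWithValue("@Description", description);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}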

Next we create the class to listen to and process the incoming requests from Azure Service Bus. The class has a default public function called Execute which accepts a RemoteExecutionContext parameter. This parameter contains the Shared Variables that we passed from our plugin, as well as the execution context that allows us to retrieve the message name to know whether this is a create or an update.

Azure Service Bus Listener Entry Point
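Since the endpoint was registered as two-way, the listener class typically implements the SDK's ITwoWayServiceEndpointPlugin interface; a minimal sketch, with the shared variable key and attribute names as assumptions:

using Microsoft.Xrm.Sdk;

public class AccountListener : ITwoWayServiceEndpointPlugin
{
    public string Execute(RemoteExecutionContext context)
    {
        // The plugin placed the account record in the shared variables of the context.
        Entity account = (Entity)context.SharedVariables["Account"];

        if (context.MessageName == "Create")
        {
            SqlDataManager.CreateAccount(
                account.Id,
                account.GetAttributeValue<string>("name"),
                account.GetAttributeValue<string>("description"));
            return "Account created in SQL Server.";
        }

        // The Update message would call an equivalent UpdateAccount helper in the same way.
        return "No action taken for message: " + context.MessageName;
    }
}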

The CreateAccount and UpdateAccount functions receive the Account entity, take the necessary attributes and call the CreateAccount function of the SqlDataManager class in order to store the data in SQL Server.

Azure Service Bus Listener Create Account

We added the Service class to the project, which contains an eventlog component that will write errors to the Event Log. The OnStart event starts listening to the Azure Service Bus and the OnStop event stops listening.

Azure Service Bus Windows Service Start/Stop

The Console application will run as a Windows Service. The solution includes the ProjectInstaller class, which allows us to install it as a Windows Service. The Main entry point of the application provides us with some testing capabilities as well as installation or uninstallation of the Windows Service.

We added a Program class that allows us to install and uninstall the Windows Service from within our Visual Studio debugger.

Install or Uninstall Windows Service from Visual Studio Debugger

Once the Windows Service is installed by running the VS application with the –install parameter, you will see the results in Services. Verify the Service is running, and depending on whether you are testing this on your local machine or on a server, determine whether to use the Local System account or the Network Service account.

Azure Service Bus Installed Windows Service

Now that we are done installing all the components, it is time to test all the parts working together. We tried running the first test, and the test did not have all the required fields. The description field was missing, so the result was a new note added to the timeline returning the error of a missing field from SQL Server.

Azure Service Bus Integration (End to End - First Try)

We then tried again, providing the Description field as well. This time we saw the success message in the timeline.

Azure Service Bus Integration (End to End - Second Try)

We wanted to verify that the data existed in SQL Server, so we ran a SQL Statement to validate the data, and received the following results:
New Record created in SQL Server. Results successful

Hope this will be a benefit for you. All the Plugin Source, Database Source and Listener Console Application Source Code is available on Github.
https://github.com/ariclevin/Integration/tree/master/ASBListener

Global Cloning functionality for Dynamics 365
https://aric.isite.dev/dynamics/post/global-cloning-for-dynamics-365/
Thu, 15 Feb 2018 05:28:00 +0000

Recently I had a requirement to provide cloning capabilities for one of the projects that I was working on. It wasn’t so simple as to just clone an individual record, but also provide the ability to clone the relationships.

This is where it becomes tricky, as some entities cannot be easily cloned due to certain restrictions, so we wanted to give this solution the ability to restrict certain actions from happening. For example, the address1_addressid and address2_addressid fields in the Account and Contact entities cannot be cloned as they point to the Customer Address record, so the Guid there has to be unique.

So for the first scenario, we needed to provide the system with the ability to restrict certain fields from being cloned: by providing a status of Active or Inactive, the cloning solution will decide whether or not to clone the attribute. The screenshot below shows the Clone Settings entity, with the list of attributes that are available for cloning. Notice that the highlighted row is marked as Inactive, which means it will not be cloned.

Clone Settings Entity

Next we had to deal with relationships. One of the issues with relationships is that there are probably many relationships that you do not want to be cloned. In the related entities, we provided 3 statuses: Duplicate, Reassociate or Inactive. The Inactive option skips the cloning procedure for the relationship. The Duplicate option will make a duplicate of the related entity record, and the Reassociate option will reassociate the related lookup from the source record to the cloned record.

Clone Relationships

We modified the application ribbon so that the Clone button will appear on every entity (based on a Web API call to check if the entity is enabled for cloning in the Clone Settings), and added a global script library for calling the Clone action. The end result is as follows.

Original and Cloned records

This solution works for our purpose, but I would consider it a Beta release for anyone who is interested in the source code. It's available on GitHub, so you can make whatever changes you want. I will add instructions on how to use it sometime soon.

Create Email Message with Attachment from a Note
https://aric.isite.dev/dynamics/post/create-email-message-with-attachments/
Mon, 25 Dec 2017 18:59:00 +0000

Recently I saw a request about how to create an email message with an attachment that exists in a Note, and how to do this via a Plugin. I remembered doing something like that in an old project, so I thought that I would share the logic behind this.

The first thing is to decide how this will be called. This logic can be called from an Action or Plugin, but the logic will reside in backend code. How to initiate the process is up to you. Once we initiate the process, the first thing to do is to retrieve the notes from the existing entity that contains the note documents. In order to retrieve the notes, we need to implement the following logic:

        private EntityCollection RetrieveAnnotations(string entityName, Guid entityId)
        {
            QueryExpression query = new QueryExpression(Annotation.EntityLogicalName)
            {
                ColumnSet = new ColumnSet(true),
                Criteria =
                {
                    Conditions =
                    {
                        new ConditionExpression("objectid", ConditionOperator.Equal, entityId),
                        new ConditionExpression("objecttypecode", ConditionOperator.Equal, entityId)
                    }
                }
            };

            EntityCollection results = service.RetrieveMultiple(query);
            return results;
        }

This logic will retrieve all of the annotations related to a particular entity record. I did not add another condition for IsDocument, but you can add it if required; I will show that at the end. The next step is to create two more functions. The first function will create an email message, and the second will add the attachment to the email message that was created. Let's take a look at each one of these functions separately.

The Create Email Message function receives 4 Parameters: From (Type Entity), To (Type Entity), Subject (Type String) and Email Body (Type String), and returns the Guid of the Email Message that was created. The source is shown below, and can be modified to fit your exact needs:

        private Guid CreateEmailMessage(Entity from, Entity to, string subject, string description)
        {
            Guid emailid = Guid.Empty;
            Entity email = new Entity("email");

            EntityCollection fromParty = new EntityCollection() { EntityName = "activityparty" };
            fromParty.Entities.Add(from);
            email.Attributes["from"] = fromParty;

            EntityCollection toParty = new EntityCollection() { EntityName = "activityparty" };
            toParty.Entities.Add(to);
            email.Attributes["to"] = toParty;

            email.Attributes["subject"] = subject;
            email.Attributes["description"] = description;

            try
            {
                emailid = service.Create(email);
            }
            catch (FaultException<OrganizationServiceFault> ex)
            {
                string message = "An error occurred in the CreateEmailMessage function";
                throw new InvalidPluginExecutionException(message);
            }
            return emailid;
        }

Once we create the email message we can add the attachment to it. The information for creating the attachment is retrieved from the notes, so there are no real changes required. We will see how everything fits together at the end.

        private Guid AddAttachmentsToEmail(Guid emailId, string fileName, string documentBody, string mimeType)
        {
            Entity attachment = new Entity("activitymimeattachment");
            attachment["subject"] = "Attachment to Email";
            attachment["filename"] = fileName;
            attachment["mimetype"] = mimeType;
            attachment["body"] = documentBody;

            attachment["objectid"] = new Entity("email", emailId);
            attachment["objecttypecode"] = "email";

            try
            {
                Guid attachmentId = service.Create(attachment);
                return attachmentId;
            }
            catch (FaultException<OrganizationServiceFault> ex)
            {
                string message = "An error occurred in the AddAttachmentsToEmail function";
                throw new InvalidPluginExecutionException(message, ex);
            }
        }

Now that the email message is created and the attachment is added to it, we just need to send the email message. The function only needs the Id of the email, but can be modified as needed. Note that SendEmailRequest and SendEmailResponse come from the Microsoft.Crm.Sdk.Messages namespace:

        private void SendEmail(Guid emailId)
        {
            SendEmailRequest request = new SendEmailRequest();
            request.EmailId = emailId;
            request.TrackingToken = "";
            request.IssueSend = true;

            try
            {
                SendEmailResponse response = (SendEmailResponse)service.Execute(request);
            }
            catch (FaultException<OrganizationServiceFault> ex)
            {
                string message = "An error occurred in the SendEmail function.";
                throw new InvalidPluginExecutionException(message, ex);
            }
        }

Finally, we can put everything together in the CreateEmailLogic function, which retrieves the notes by calling the RetrieveAnnotations function, loops through the collection of notes, checks whether each one is a document, creates the email message, adds the attachment and sends the email. This is what the entry point function looks like:

        private void CreateEmailLogic(string customEntityName, Guid entityRecordId)
        {
            EntityCollection notes = RetrieveAnnotations(customEntityName, entityRecordId);
            if (notes.Entities.Count > 0)
            {
                foreach (Entity note in notes.Entities)
                {
                    bool isDocument = Convert.ToBoolean(note["isdocument"]);

                    if (isDocument)
                    {
                        // Should check these attributes exist in Note
                        string documentBody = note["documentbody"].ToString();
                        string mimeType = note["mimetype"].ToString();
                        string fileName = note["filename"].ToString();


                        // Need to get information to generate email message
                        Guid emailId = CreateEmailMessage(from, to, subject, emailMessage);
                        AddAttachmentsToEmail(emailId, fileName, documentBody, mimeType);
                        SendEmail(emailId);
                    }
                }
            }
        }
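The from, to, subject and emailMessage values in CreateEmailLogic are left for you to supply. As an illustration only (the helper name and the idea of passing in the party type and id are my assumptions, not part of the original post), the activity party entities could be built with a small helper like this:

        // Hypothetical helper - builds an activityparty entity pointing at the given record,
        // suitable for the From/To parameters of CreateEmailMessage above
        private Entity BuildActivityParty(string partyEntityName, Guid partyId)
        {
            Entity party = new Entity("activityparty");
            party["partyid"] = new EntityReference(partyEntityName, partyId);
            return party;
        }

With that in place, the commented section of CreateEmailLogic could call, for example, CreateEmailMessage(BuildActivityParty("systemuser", senderUserId), BuildActivityParty("contact", recipientContactId), subject, body), where senderUserId, recipientContactId, subject and body come from your own configuration or from the triggering record.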

There are many possible variations to the above logic and functions, but with them you should be able to accommodate your requirements.
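For completeness, here is a minimal sketch of how these helpers could be hosted inside a plugin class. The class name and the IPlugin wiring below are assumptions of mine, not part of the original post:

    public class EmailNoteAttachmentsPlugin : IPlugin
    {
        private IOrganizationService service;

        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            service = factory.CreateOrganizationService(context.UserId);

            // Run the logic against the record that triggered the plugin
            CreateEmailLogic(context.PrimaryEntityName, context.PrimaryEntityId);
        }

        // RetrieveAnnotations, CreateEmailMessage, AddAttachmentsToEmail,
        // SendEmail and CreateEmailLogic from above go here
    }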

The post Create Email Message with Attachment from a Note appeared first on Aric Levin's Digital Transformation Blog.

]]>
Cloning a Record in Dynamics CRM https://aric.isite.dev/dynamics/post/clone-record-in-dynamics-crm/ Sun, 22 Oct 2017 04:22:00 +0000 https://aric.isite.dev/index.php/2017/10/22/cloning-a-record-in-dynamics-crm/ Recently we received requests from clients and some questions from Dynamics Community members on how to clone records. Although there are some available solutions out there, and various possibilities for how to implement this, we would like to demonstrate one possible and not too complicated way to implement it. This implementation involves using Ribbon Workbench to create the Clone button and a Command that executes a JavaScript function, which calls an Action and executes Plugin/Action code to copy the record.

The post Cloning a Record in Dynamics CRM appeared first on Aric Levin's Digital Transformation Blog.

]]>
Recently we received requests from clients and some questions from Dynamics Community members on how to clone records. Although there are some available solutions out there, and various possibilities for how to implement this, we would like to demonstrate one possible and not too complicated way to implement it. This implementation involves using Ribbon Workbench to create the Clone button and a Command that executes a JavaScript function, which calls an Action and executes Plugin/Action code to copy the record.

So the first step, of course, is using Ribbon Workbench to create the button, the enable/display rules and the command. We start off by adding the entity that we want to customize to a new unmanaged solution. We only need to add the entity itself, not all of its components. Once we have created the solution, we open Ribbon Workbench and select the new solution.

We can now add a new button with an image to the form command bar, as shown in the picture below:

Add Button to Form Command Bar

You should also add the Label, Alt, Tool Tip Text and Tool Tip Description. It is a good habit to have the Tool Tip Description display different text than the Tool Tip Text. We can now create the display rule. We created a simple display rule with only a FormStateRule that has a State of Existing. This means that the Clone command will only be available for records that already exist in the system, but not for newly created or disabled records. You can change the display rule as you wish.

Once we have the display rule, we will create the command. You should have a JavaScript library in place ahead of time for the entity that you wish to clone, so that you can use it for the command. You can also use a global library if you are planning to use the same logic for multiple entities. In the command we create a Custom JavaScript Action and specify the name of the library and the function that will be called when the Clone command bar button is clicked (shown in the image below).

Clone Record Command

Let’s go ahead and look at the finalized button properties before publishing the solution changes.

Clone Button

We can now click on the Publish button in Ribbon Workbench to finish adding the button to our form. The next step is to add the JavaScript code to the library that we previously created. There are a few ways of calling Actions from JavaScript, but we prefer to use the process.js library, which can be downloaded from GitHub here: https://github.com/PaulNieuwelaar/processjs. Add this library to the form where you added the Clone button, so that you can use it from your JavaScript library.

function cloneGlobalSetting()
{
    var entityId = Xrm.Page.data.entity.getId();
    var entityName = "new_globalsetting";
    var messageName = "new_CloneGlobalSetting";
    var success = callCloneAction(entityId, entityName, messageName);
    if (success) {
        Xrm.Page.ui.setFormNotification("Record has been submitted for cloning.", "INFO", "GEN");
        setTimeout(function () {
            reloadPage(entityId);
        }, 3000);
    }
}

The above function calls callCloneAction, passing the Guid of the entity record, the name of the entity and the name of the SDK message that will be called. The code snippets below show the callCloneAction and callProcessAction functions. Although these could be combined into one, we separated them because we call multiple Actions in our code and use callProcessAction for several purposes. If you only have a single Action call, you can combine them, or you can keep callProcessAction in a global script library so it can be reused across multiple entities. Keep in mind that the success and error callbacks may fire asynchronously, so the success flag returned by callProcessAction mainly indicates that the request was submitted.

function callCloneAction(entityId, entityName, messageName) {

    var inputParams =
        [
            {
                key: "Target",
                type: Process.Type.EntityReference,
                value: new Process.EntityReference(entityName, entityId)
            }
        ];
    var success = callProcessAction(entityId, entityName, messageName, inputParams);
    return success;
}

function callProcessAction(entityId, entityName, messageName, inputParams) {
    var success = true;
    Process.callAction(messageName, inputParams,
    function (params) {
        // Success
        var result = "";
        for (var i = 0; i < params.length; i++) {
            result += params[i].key + "=" + params[i].value + ";";
        }
        // alert(result);
        success = true;
    },
    function (e, t) {
        // Error
        success = false;
        alert(e);

        if (window.console && console.error)
            console.error(e + "n" + t);
    }
    );
    return success;
}

Now that we have added our JavaScript code to our library, we need to upload and publish the file, as well as add it to the form. Once that is done, we need to create the Action process in our solution. We do this by navigating to Settings -> Processes and creating a new Action. The screenshot below shows the Action process that we created. Note that there is a parameter called Action Type, which is not required for this particular case, but is used in our cloning process, since the record can be cloned in a few different ways.

Create Action Process

We now have to add the Plugin/Action code and register it. The first thing to do is create a new plugin. If this is the first time you are creating a new plugin, please follow the Microsoft MSDN article on how to create a Basic Plugin here. In our Plugin project, we created two classes, although this can be done with one. The first class shown below is just the entry point which calls the actual class that processes the action.

    public class GlobalSetting : Plugin
    {
        public GlobalSetting()
            : base(typeof(GlobalSetting))
        {
            this.RegisteredEvents.Add(new Tuple<int, string, string, Action<LocalPluginContext>>(40, "new_CloneGlobalSetting", "new_globalsetting", new Action<LocalPluginContext>(ExecutePostGlobalSettingClone)));
        }

        protected void ExecutePostGlobalSettingClone(LocalPluginContext localContext)
        {
            if (localContext == null)
            {
                throw new ArgumentNullException("localContext");
            }
            string entityName = localContext.PluginExecutionContext.PrimaryEntityName;
            Guid entityId = localContext.PluginExecutionContext.PrimaryEntityId;
            string actionType = localContext.PluginExecutionContext.InputParameters["ActionType"].ToString();


            ITracingService tracingService = localContext.TracingService;
            tracingService.Trace("Entered {0} Plugin Method", "ExecutePostGlobalSettingClone");

            using (GlobalSettingLogic logic = new GlobalSettingLogic(localContext.OrganizationService, localContext.TracingService))
            {
                logic.CloneGlobalSetting(entityName, entityId, actionType);
            }
        }
    }

In our second class (GlobalSettingLogic), which simply holds the logic for this requirement, we add the following function. Note that the entry point above calls a public CloneGlobalSetting(entityName, entityId, actionType) method, which is expected to retrieve the source record and delegate to the private function shown here (a sketch of that wrapper follows the code):

        private Guid CloneGlobalSetting(Entity globalSetting)
        {
            Entity newGlobalSetting = new Entity(new_GlobalSetting.EntityLogicalName);

            foreach (KeyValuePair<String, Object> attribute in globalSetting.Attributes)
            {
                string attributeName = attribute.Key;
                object attributeValue = attribute.Value;

                switch (attributeName.ToLower())
                {
                    case "new_globalsettingid":
                        break;
                    case "new_name":
                        newGlobalSetting[attributeName] = attributeValue + " - Cloned";
                        break;
                    default:
                        newGlobalSetting[attributeName] = attributeValue;
                        break;
                }

            }

            try
            {
                Guid globalSettingId = service.Create(newGlobalSetting);
                return globalSettingId;
            }
            catch (FaultException<OrganizationServiceFault> ex)
            {
                throw new InvalidPluginExecutionException("An error occurred in the CloneGlobalSetting function of the plug-in.", ex);
            }
        }
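The public wrapper itself is not included in the post. A minimal sketch of what it might look like, assuming the logic class holds an IOrganizationService field named service and that a straight copy ignores the actionType, is shown below:

        // Hypothetical public wrapper - not part of the original post
        public Guid CloneGlobalSetting(string entityName, Guid entityId, string actionType)
        {
            // Retrieve the source record with all of its attributes
            Entity source = service.Retrieve(entityName, entityId, new ColumnSet(true));

            // actionType could drive different cloning behaviors; for a straight
            // copy we simply delegate to the private method above
            return CloneGlobalSetting(source);
        }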

We now have to build our plugin and deploy it using the Plugin Registration Tool (as shown in the following screenshot). The Action process must have been created and activated prior to this step in order for it to appear as a message in the Plugin Registration Tool.

Register Action

Once the plugin is registered, you are basically done. You can now test it out: go to your form, click on the Clone button, and a newly cloned record will be created. The final form, with the Clone button, will look like this:

Clone Button on Form

The post Cloning a Record in Dynamics CRM appeared first on Aric Levin's Digital Transformation Blog.

]]>
Publishing Plugin Steps from Code https://aric.isite.dev/development/post/publishing-plugin-steps-from-code/ Wed, 18 May 2016 17:47:00 +0000 https://aric.isite.dev/index.php/2016/05/18/publishing-plugin-steps-from-code/ Sometimes you write applications that contain logic requiring you to create steps for your plugin based on other entities. For example, if you have an AutoNumber solution, you can create a plugin step for a particular entity every time you create a new AutoNumber record.

The post Publishing Plugin Steps from Code appeared first on Aric Levin's Digital Transformation Blog.

]]>
Sometimes you write applications that contain logic requiring you to create steps for your plugin based on other entities. For example, if you have an AutoNumber solution, you can create a plugin step for a particular entity every time you create a new AutoNumber record.

This process will allow your AutoNumber functionality to be immediately available when the user creates a new AutoNumber Settings record.

In order to do this, you will need to either retrieve or hard code the name of your assembly and plugin type as shown below.

const string ASSEMBLY_NAME = "Dyn365.Xrm.Plugins.AutoNumber";
const string PLUGIN_TYPE_NAME = "Dyn365.Xrm.Plugins.AutoNumber.PluginEntryPoint";

We then need to get the Guid of the message that we are registering for, as well as the Object Type Code of the entity, and check whether a step for this message already exists, so that we can bypass the logic if it does.

public void PublishSDKMessageProcessingStep(string entityName, Guid impersonatingUserId)
{
   Guid sdkMessageId = GetSdkMessageId("Create");
   int? objectTypeCode =  RetrieveEntityMetadataObjectTypeCode(entityName.ToLower());
   if (objectTypeCode.HasValue)
   {
      bool stepExists = RetrieveSdkMessageProcessingStep(sdkMessageId, objectTypeCode.Value);
      if (!stepExists)
         CreateSdkMessageProcessingStep(string.Format("{0}: Create of {1}", PLUGIN_TYPE_NAME, entityName), entityName.ToLower(), "", StepMode.Sync, 10, PluginStage.PostOperation, StepInvocationSource.Parent, impersonatingUserId);
   }
}

private int? RetrieveEntityMetadataObjectTypeCode(string entityName)
{
    RetrieveEntityRequest request = new RetrieveEntityRequest()
    {
        EntityFilters = EntityFilters.Entity,
        LogicalName = entityName
    };

    RetrieveEntityResponse response = (RetrieveEntityResponse)service.Execute(request);
    int? result = response.EntityMetadata.ObjectTypeCode;
    return result;
}

RetrieveSdkMessageProcessingStep is a QueryExpression on SdkMessageProcessingStep with multiple linked entities (SdkMessageFilter and PluginType) to check whether the step already exists. It is probably easiest to do this as a FetchXml query, but it works both ways (a hedged sketch of this helper is included at the end of this post). The CreateSdkMessageProcessingStep function is called after retrieving the required values. This method creates the SdkMessageProcessingStep record; we will need to get the Ids of the plugin assembly, plugin type, message and message filter in order to complete the save operation.

private Guid CreateSdkMessageProcessingStep(string name, string entityName, string configuration, StepMode mode, int rank, PluginStage stage, StepInvocationSource invocationSource, Guid impersonatingUserId)
{
    Entity step = new Entity("sdkmessageprocessingstep");
    step["name"] = name;
    step["description"] = name;
    step["configuration"] = configuration;
    step["mode"] = new OptionSetValue(mode.ToInt());
    step["rank"] = rank;
    step["stage"] = new OptionSetValue(stage.ToInt());
    step["supporteddeployment"] = new OptionSetValue(0); // Server Only
    step["invocationsource"] = new OptionSetValue(invocationSource.ToInt());

    Guid sdkMessageId = GetSdkMessageId("Create");
    Guid sdkMessageFilterId = GetSdkMessageFilterId(entityName, sdkMessageId);

    Guid assemblyId = GetPluginAssemblyId(ASSEMBLY_NAME);
    Guid pluginTypeId = GetPluginTypeId(assemblyId, PLUGIN_TYPE_NAME);

    step["plugintypeid"] = new EntityReference("plugintype", pluginTypeId);
    step["sdkmessageid"] = new EntityReference("sdkmessage", sdkMessageId);
    step["sdkmessagefilterid"] = new EntityReference("sdkmessagefilter", sdkMessageFilterId);

    if (impersonatingUserId != Guid.Empty)
        step["impersonatinguserid"] = new EntityReference("systemuser", impersonatingUserId);

    try
    {
        Guid stepId = service.Create(step);
        return stepId;
    }
    catch (InvalidPluginExecutionException)
    {
        throw;
    }
    catch (Exception)
    {
        throw;
    }
}
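The StepMode, PluginStage and StepInvocationSource enums and the ToInt() helper used above are the author's own and are not shown in the post. A plausible definition, based on the documented option set values of the sdkmessageprocessingstep entity, might look like this:

public enum StepMode { Sync = 0, Async = 1 }

public enum PluginStage { PreValidation = 10, PreOperation = 20, PostOperation = 40 }

public enum StepInvocationSource { Parent = 0, Child = 1 }

public static class StepEnumExtensions
{
    // ToInt() as used when building the sdkmessageprocessingstep record
    public static int ToInt(this Enum value)
    {
        return Convert.ToInt32(value);
    }
}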

private Guid GetSdkMessageId(string SdkMessageName)
{
    try
    {
        //GET SDK MESSAGE QUERY
        QueryExpression sdkMessageQueryExpression = new QueryExpression("sdkmessage");
        sdkMessageQueryExpression.ColumnSet = new ColumnSet("sdkmessageid");
        sdkMessageQueryExpression.Criteria = new FilterExpression
        {
            Conditions =
                {
                    new ConditionExpression
                    {
                        AttributeName = "name",
                        Operator = ConditionOperator.Equal,
                        Values = {SdkMessageName}
                    },
                }
        };

        //RETRIEVE SDK MESSAGE
        EntityCollection sdkMessages = service.RetrieveMultiple(sdkMessageQueryExpression);
        if (sdkMessages.Entities.Count != 0)
        {
            return sdkMessages.Entities.First().Id;
        }
        throw new Exception(String.Format("SDK MessageName {0} was not found.", SdkMessageName));
    }
    catch (InvalidPluginExecutionException)
    {
        throw;
    }
    catch (Exception)
    {
        throw;
    }
}

private Guid GetSdkMessageFilterId(string EntityLogicalName, Guid sdkMessageId)
{
    try
    {
        //GET SDK MESSAGE FILTER QUERY
        QueryExpression sdkMessageFilterQueryExpression = new QueryExpression("sdkmessagefilter");
        sdkMessageFilterQueryExpression.ColumnSet = new ColumnSet("sdkmessagefilterid");
        sdkMessageFilterQueryExpression.Criteria = new FilterExpression
        {
            Conditions =
                {
                    new ConditionExpression
                    {
                        AttributeName = "primaryobjecttypecode",
                        Operator = ConditionOperator.Equal,
                        Values = {EntityLogicalName}
                    },
                    new ConditionExpression
                    {
                        AttributeName = "sdkmessageid",
                        Operator = ConditionOperator.Equal,
                        Values = {sdkMessageId}
                    },
                }
        };

        //RETRIEVE SDK MESSAGE FILTER
        EntityCollection sdkMessageFilters = service.RetrieveMultiple(sdkMessageFilterQueryExpression);

        if (sdkMessageFilters.Entities.Count != 0)
        {
            return sdkMessageFilters.Entities.First().Id;
        }
        throw new Exception(String.Format("SDK Message Filter for {0} was not found.", EntityLogicalName));
    }
    catch (InvalidPluginExecutionException)
    {
        throw;
    }
    catch (Exception)
    {
        throw;
    }
}

private Guid GetPluginAssemblyId(string assemblyName)
{
    try
    {
        //GET ASSEMBLY QUERY
        QueryExpression pluginAssemblyQueryExpression = new QueryExpression("pluginassembly");
        pluginAssemblyQueryExpression.ColumnSet = new ColumnSet("pluginassemblyid");
        pluginAssemblyQueryExpression.Criteria = new FilterExpression
        {
            Conditions =
                {
                    new ConditionExpression
                    {
                        AttributeName = "name",
                        Operator = ConditionOperator.Equal,
                        Values = {assemblyName}
                    },
                }
        };

        //RETRIEVE ASSEMBLY
        EntityCollection pluginAssemblies = service.RetrieveMultiple(pluginAssemblyQueryExpression);
        Guid assemblyId = Guid.Empty;
        if (pluginAssemblies.Entities.Count != 0)
        {
            //ASSIGN ASSEMBLY ID TO VARIABLE
            assemblyId = pluginAssemblies.Entities.First().Id;
        }
        return assemblyId;
    }
    catch (InvalidPluginExecutionException)
    {
        throw;
    }
    catch (Exception)
    {
        throw;
    }
}

private Guid GetPluginTypeId(Guid assemblyId, string PluginTypeName)
{
    try
    {
            //GET PLUGIN TYPES WITHIN ASSEMBLY
            QueryExpression pluginTypeQueryExpression = new QueryExpression("plugintype");
            pluginTypeQueryExpression.ColumnSet = new ColumnSet("plugintypeid");
            pluginTypeQueryExpression.Criteria = new FilterExpression
            {
                Conditions =
                {
                    new ConditionExpression
                    {
                        AttributeName = "pluginassemblyid",
                        Operator = ConditionOperator.Equal,
                        Values = {assemblyId}
                    },
                    new ConditionExpression
                    {
                        AttributeName = "typename",
                        Operator = ConditionOperator.Equal,
                        Values = {PluginTypeName}
                    },
                }
            };

            //RETRIEVE PLUGIN TYPES IN ASSEMBLY
            EntityCollection pluginTypes = service.RetrieveMultiple(pluginTypeQueryExpression);

            //RETURN PLUGIN TYPE ID
            if (pluginTypes.Entities.Count != 0)
            {
                return pluginTypes.Entities.First().Id;
            }
            throw new Exception(String.Format("Plugin Type {0} was not found in Assembly", PluginTypeName));
    }
    catch (InvalidPluginExecutionException)
    {
        throw;
    }
    catch (Exception)
    {
        throw;
    }
}
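One more helper, RetrieveSdkMessageProcessingStep, was mentioned earlier but not shown. A minimal sketch of what it might look like is below; this is an assumption on my part (the original may use FetchXml and additional links, for example on plugintype, and primaryobjecttypecode can also be filtered by logical name instead of object type code):

private bool RetrieveSdkMessageProcessingStep(Guid sdkMessageId, int objectTypeCode)
{
    //CHECK IF A STEP ALREADY EXISTS FOR THIS MESSAGE AND ENTITY
    QueryExpression stepQueryExpression = new QueryExpression("sdkmessageprocessingstep");
    stepQueryExpression.ColumnSet = new ColumnSet("sdkmessageprocessingstepid");
    stepQueryExpression.Criteria = new FilterExpression
    {
        Conditions =
        {
            new ConditionExpression
            {
                AttributeName = "sdkmessageid",
                Operator = ConditionOperator.Equal,
                Values = {sdkMessageId}
            },
        }
    };

    //LINK TO THE SDK MESSAGE FILTER TO RESTRICT BY ENTITY
    LinkEntity filterLink = stepQueryExpression.AddLink("sdkmessagefilter", "sdkmessagefilterid", "sdkmessagefilterid");
    filterLink.LinkCriteria.AddCondition("primaryobjecttypecode", ConditionOperator.Equal, objectTypeCode);

    //RETURN TRUE IF A MATCHING STEP WAS FOUND
    EntityCollection steps = service.RetrieveMultiple(stepQueryExpression);
    return steps.Entities.Count > 0;
}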

As you can see, the process is pretty simple, but it requires a number of round trips to the server to get the unique identifiers of the plugin assembly, plugin type, SDK message and SDK message filter.

The post Publishing Plugin Steps from Code appeared first on Aric Levin's Digital Transformation Blog.

]]>
Creating an Approval Process https://aric.isite.dev/dynamics/post/create-an-approval-process/ Wed, 11 Nov 2015 04:07:00 +0000 https://aric.isite.dev/index.php/2015/11/11/creating-an-approval-process/ We recently had a requirement to add an approval process to an entity in CRM. This was a custom entity (Comment), where most of the fields were prepopulated from the web, but it had a response field that had to go through multiple layers of approval/review. The review process was as follows: once a comment was received, it would be assigned to a responder, and the response would then go through manager, legal and director approvals.

The post Creating an Approval Process appeared first on Aric Levin's Digital Transformation Blog.

]]>
We recently had a requirement to add an approval process to an entity in CRM. This was a custom entity (Comment), where most of the fields were prepopulated from the web, but it had a response field that had to go through multiple layers of approval/review. The review process was as follows: once a comment was received, it would be assigned to a responder, and the response would then go through manager, legal and director approvals.

We first created a field in CRM called Review Stage that would hold the value of the review stage.

We added Submit for Approval, Approve and Lock/Unlock buttons using Ribbon Workbench. Thank you, Scott DuBrow.

Ribbon Workbench

The next step was to provide custom privileges for the users. Some users would have multiple roles and particular rights that had to be validated before a particular action could be performed. Although this could have been done via Security Roles, the restrictions had to be per user, so we added certain security rights to the User entity. We also considered doing this in a custom entity, but since there were not that many special privileges, we ended up adding the fields to the user entity.

Privileges

The different roles of the users required particular rights, such as which users could assign or reassign a record to another user, which users could submit for approval (mark complete), and which users could approve. The main reasoning behind this is that users could hold multiple roles (such as being a responder and a manager, or being a manager and a director).

Comment Flow

There were three main Actions (or custom messages) that would be called from the Comment record: Assign, Submit for Approval and Approve. Each of these would call the Action process via JavaScript, which would then execute the plugin code registered on the message.

The following is an example of the Approve Comment Action that is being called via JavaScript.

Action Properties

Once the Action is called, it executes the plugin code, which validates whether the user has the proper privileges and is the proper user in the chain of approvals.

Sample Code

When the plugin code finished executing, the form would refresh and show the user that the process had advanced to the next stage. We did this by creating a custom wizard web resource (it looks similar to a Business Process Flow, but without all the bells and whistles). This also allowed us to lock down records once they reached certain stages of the approval process.

Final Flow

Since directors are not really used to going through hundreds of records in CRM, we created an export process for them that allows them to export their comments to Excel. We restricted the use of the out-of-the-box Export to Excel feature, since we needed the Excel workbook to be protected and restricted to only certain columns. We created a console application, called by Task Scheduler running on a separate application server, that would check for Excel export requests from users, generate the files and send them to the users via email. The process was implemented using the Infragistics Excel framework (an easier way to generate Excel files than using the Excel object model). The files would be locked for editing with the exception of the columns/rows that we needed.

We then created an SSIS import package using the SSIS Integration Toolkit for Microsoft Dynamics CRM (by KingswaySoft) that allowed us to validate the changes from the high-level approvers who wanted to go over the responses and upload them back into CRM.

The post Creating an Approval Process appeared first on Aric Levin's Digital Transformation Blog.

]]>