Visual Studio Archives - Aric Levin's Digital Transformation Blog
https://aric.isite.dev/tag/visual-studio/
Microsoft Dynamics 365, Power Platform and Azure

Using Azure DevOps for Power Platform with a Self-Hosted Agent
https://aric.isite.dev/azure/post/azure-devops-self-hosted-agent/
Sun, 22 Nov 2020
Some organizations, whether for trust or security reasons, are not comfortable using the Microsoft-hosted agents and would prefer to use self-hosted agents within their own corporate environment.

In this article, I will go through the steps of creating a self-hosted agent, and then configuring a pipeline that will move your solution between the different environments.

Let’s go ahead and start by creating our agent. The first thing that we have to do is create a personal access token. A personal access token is used as an alternate password to authenticate to Azure DevOps. Personal access tokens are easy to create and easy to revoke if they are no longer required.

We start by clicking on the User Settings Icon on the top right corner of our DevOps environment, and selecting Personal Access Tokens from the menu.

Azure DevOps - User Settings

If you don’t have any tokens, click on the New Token button.

Azure DevOps - Create New Token

This will pop up the Create a new personal access token panel. Provide a name for the token, and select the organization and expiration date. You can create a custom-defined scope and set the appropriate permissions that the token will have, or give it Full access to your DevOps environment. For the purpose of this article, we will provide it with Full access, but you could also select only Agent Pools (read, manage) and Deployment group (read, manage). You can click on Show all scopes at the bottom of the panel to see all available authorization scopes for the personal access token. The image below shows the basic configuration.

Azure DevOps - Create New Access Token

Once the personal access token is created (and before you close your window), copy the token. You are going to need it later on. Microsoft does not store the token, so you will not be able to see it again.

Azure DevOps - New Token Success Confirmation
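As a side note, the copied PAT can be supplied anywhere a password is expected when authenticating against Azure DevOps. For example (organization, project, repository and token values below are placeholders, not values from this article):

```
# Clone a repo using the PAT in place of a password:
git clone https://{PAT}@dev.azure.com/{organization}/{project}/_git/{repository}

# Or call the Azure DevOps REST API with basic auth (empty username + PAT):
curl -u :{PAT} "https://dev.azure.com/{organization}/_apis/projects?api-version=6.0"
```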

Next, we will go ahead and configure the Security for the Agent Pool. Azure DevOps organization owners and server administrators have permissions by default.

Navigate to the Agent Pools by clicking on the Azure DevOps logo on the top left, select Organization Settings on the bottom left, and then in the left hand navigation, click on Agent Pools. This will display the list of Agent Pools in the center pane. Select the pool that you want to configure, and click on the Security tab in the newly opened pane. If the user account that will be used is not shown, then make sure that you or an administrator can add that user account. You cannot add your own user account, but you should be able to add a group that you are a member of.

Azure DevOps - Agent Security

Now click on the Agents tab, and let’s create our agent. If there are no available agents, you will see a button on the screen (New agent) to create your first agent.

This will pop up a dialog where you can select your operating system and download the agent.

Azure DevOps - Download Agent

Now that you have downloaded the agent, we can go ahead and install and configure it. Extract the agent to a directory of your choice, and run the config.cmd command. In our case, we extracted the agent files into the C:\DevOpsAgent directory.

Azure DevOps - Extracted Agent Source Files

Once you run the config.cmd batch file, you will be asked for the Azure DevOps services URL, which is https://dev.azure.com/{organization_name}, and then to enter or paste the personal access token that you created previously.

Azure DevOps - Agent Configuration PowerShell Script

To determine whether you want to use the agent in Interactive Mode or Service Mode, you can click on the link below and check the differences on the Microsoft Docs website:

https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops&tabs=browser#interactive-or-service
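As a sketch, the same prompts can also be answered up front in unattended mode (pool, agent and token values below are placeholders; the flags are from the agent's documented command line):

```bat
:: Run from the agent directory, e.g. C:\DevOpsAgent (placeholder values).
config.cmd --unattended ^
  --url https://dev.azure.com/{organization_name} ^
  --auth pat --token {personal_access_token} ^
  --pool {agent_pool_name} --agent {agent_name} ^
  --runAsService

:: In interactive mode, start the agent manually instead:
run.cmd
```

The --runAsService switch registers the agent as a Windows service; leave it off if you need interactive mode (for example, for UI tests).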

Before we activate the agent, let’s go ahead and create our first pipeline. We will keep this pipeline simple: it will publish the customizations in our dev environment, export the solution as both managed and unmanaged, and deploy it to our test environment as a managed solution.

Let’s go ahead and create our first pipeline to test this process. The first step is to connect to the repo. We will use the classic editor to create a pipeline without YAML.

Azure DevOps - Connect to Source Code Repository

In the select source page, we will select our Azure Repos Git, and select the Team Project, the Repository and the Default branch that will be used for our manual or scheduled builds.

Azure DevOps - Select Repo Source

In the select template page, we will start with an Empty job.

Azure DevOps - Select Configuration Template

Next we will start configuring the pipeline and adding the tasks to be used with the agent. The image below shows the test pipeline that we created. The pipeline only uses two tasks, the Power Platform Tool Installer and Power Platform Publish Customizations.

Azure DevOps - Test with MS Agent

Of course this is not a very realistic pipeline, as we usually need something that does more than just publish customizations, but the purpose is just to test that this works with a self-hosted agent. Just to make sure the pipeline works, we will first try it with the MS-hosted agent.

I will go ahead and click the Save and Queue button and run this pipeline. After about 60 seconds, we get the results that the pipeline executed successfully.

Azure DevOps - Successful Test with MS Agent

Next, I will go ahead and edit the pipeline, and change the agent pool to the self-hosted agent pool that I created, click on the Save and Queue and then run the process.

Azure DevOps - Test with Self Hosted Agent

This should have been straightforward, but we encountered a few issues with the Tool Installer.

Azure DevOps - Failed Test with Self Hosted Agent

To get past this (and after contacting Microsoft Support), we had to make some modifications to the pipeline, which included installing the NuGet package and adding a PowerShell script to unregister the package source before running the installer.

Azure DevOps - Test with Self Hosted Agent including Fix

The issue addressed here is related to the agent itself and not the Power Apps task. The following two links discuss this in further detail:

https://github.com/microsoft/PowerApps-Samples/issues/123#issuecomment-641053807

https://powerusers.microsoft.com/t5/Power-Apps-Pro-Dev-ISV/PowerPlatform-tool-installer-gives-me-a-nuget-error-on-private/m-p/607181/highlight/true#M2863

After making these changes, we were able to execute the pipeline correctly.

Azure Devops - Successful Test with Fixed Self Hosted Agent

Even though this solves our problem and enables us to go ahead and create pipelines using the Power Platform tools, the underlying problem still exists.

I received an update from Microsoft (Power Apps support) last week: they were able to reproduce the issue using a local build agent. Based on the diagnostics log, what seemed to happen is that the machine where the agent was installed (a Windows 10 machine) had the older 1.0.0.1 versions of the PackageManagement and PowerShellGet modules, while the hosted Azure DevOps agents have more modern versions, 1.4.7 and 2.2.x respectively.

Once Microsoft removed the newer versions, they were able to reproduce the issue. As a temporary workaround, we were told that we could update these two modules for the build agent user and then retry. The following steps are needed:

  1. Install-PackageProvider -Name "NuGet" -Force -ForceBootstrap -Scope CurrentUser -MinimumVersion 2.8.5.208
  2. Install-Module -Name PackageManagement -MinimumVersion 1.4.7 -Scope CurrentUser -Force -AllowClobber
  3. Install-Module -Name PowerShellGet -MinimumVersion 2.2.5 -Scope CurrentUser -Force -AllowClobber

Since this issue was reported, a fix is supposed to be coming early this week (week of 11/23/2020), when a new version that checks and updates the package providers will be published to the Visual Studio Marketplace.

Hopefully everyone who works with self-hosted agents can benefit from this update. I will update this post when the change has been implemented.

Configure Azure Service Bus to integrate between CDS and On-Premise SQL database
https://aric.isite.dev/azure/post/configure-asb-integrate-cds-sql/
Mon, 06 Jan 2020
In this blog post I will demonstrate how to use Azure Service Bus and a listener application to integrate between the Common Data Service (Dynamics 365 or Power Apps model-driven application) and an on-premise SQL Server database.

There are various other ways to implement this, with the combination of Azure products such as Azure functions, Azure Logic Apps, On-Premise Data Gateway and Microsoft Flow, but those are not always available, especially when working in non-commercial environments, or when IT restricts what resources are available to the different departments of the agencies.

In order to implement this, there are a few prerequisites that have to be completed: set up the database server, write the console application that will act as the event listener, set up the Azure Service Bus, and create the plugin code that calls the Azure Service Bus when a particular event happens. The logic of this article is as follows: when an account record gets created or updated, a plugin will call the Azure Service Bus in order to notify an on-premise SQL database that a new record was created or an existing record was updated.

Configuring the SQL database

We will start with the first step, which is the configuration of the database. You can set up the database with a single table (to use only for Accounts), or add some related tables if necessary. We will also create a couple of stored procedures to insert into and update the Accounts table. Links to the code files will be shared at the end of the post. The image below displays the database structure. For the purpose of this article, the Contacts and Users tables will not be required.

Azure Staging SQL Database with Tables and Stored Procedures

Create the Azure Service Bus Namespace

We can now create the Azure Service Bus. Login to your Azure Portal and search for Service Bus under the list of available Resources. Click on the Add button to add a new service bus.

Add New Azure Service Bus Namespace

This will open the create namespace window. Enter a name for your service bus in the Create Namespace window. It will append servicebus.windows.net to the name that you specify. Select a pricing tier, your subscription, resource group and location where you want this hosted. If you do not have a resource group, you can create a new one from this window.

Azure Service Bus Create Namespace

It will take a couple of minutes (or less) to create your namespace, and then the Azure Service Bus will be visible on your screen. The first thing that we need to do is check the Shared access policy. There is a default Shared Access policy that is available with the new Azure Service Bus that was created, and includes the three available claims: Manage, Send, Listen.

Since we only need Send and Listen, we will create a new shared access policy, naming it IntegrationSharedAccessKey (or any other name that you would like) and setting the available claims to Send and Listen. After you create your shared access policy, click on it to see the keys and connection strings. You will need them for configuring CDS and your console application.
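The primary connection string copied from the policy will have roughly this shape (the values shown here are placeholders):

```
Endpoint=sb://{namespace}.servicebus.windows.net/;SharedAccessKeyName=IntegrationSharedAccessKey;SharedAccessKey={primary_key}
```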

Configure the Service Endpoint in CDS

Next, let’s add this to our CDS environment by running the Plugin Registration Tool. We will be using the Plugin Registration Tool for version 9.1. Run the Plugin Registration Tool and create a new connection to your CDS environment. Click on Register and select a new Service Endpoint. Copy the connection string from your Service Bus resource in Azure, and paste it in the popup window. Make sure that the Let’s Start with the connection string from the Azure Service Bus Portal… option is selected (as shown below).

Add Service Bus Endpoint Connection String

In the next window, Service Endpoint Registration, make the following changes:

  • In the namespace address, change sb to https
  • Change the Designation Type to Two Way (you can also use One Way if you prefer, but in this particular example we are using two-way)
  • Enter a path for the Service Endpoint. This will be used by the console application to listen for where changes are being made.
  • The Authorization Type should be set to SASKey, with the SASKeyName set to the name of your shared access policy in Azure, and the SAS Key copied from the Primary Key of the shared access policy.

Azure Service Bus Service Endpoint Registration

After we have created the Service Endpoint, we need to capture the Guid of the Service Endpoint, as we will use it from the plugin that we will soon start developing. In order to get the Guid of the Service Endpoint, click on the Properties tab, and scroll down until you see the ServiceEndpointId property. Copy the Guid from there. We can hard-code this in our application, or add it as an environment variable inside of CDS.

Add Environmental Variable in CDS

Navigate to a solution inside of Power Apps, select the solution and click on the New button on the command bar. You will see the option to add a new environment variable. Enter a display name, schema name (automatically populated), description and data type. As this is going to store a Guid, you should use Text as the data type. You can enter a default value for your new endpoint id or specify a new value. The screenshot below shows how this is added.

Common Data Service Environmental Variable

As environment variables are still a new feature and there are some issues that have to be dealt with, you can instead use the Portal Settings entity or another configuration entity to store your variables.
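If you do stay with the standard environment variables, reading one from code can be sketched roughly like this (a sketch only, not the exact code used in this post; it assumes Microsoft.Xrm.Sdk and Microsoft.Xrm.Sdk.Query are referenced, service is an IOrganizationService, and new_ServiceEndpointId is an illustrative schema name):

```csharp
// Sketch: read an environment variable's current value, falling back to its
// default, from the standard environment variable tables.
var query = new QueryExpression("environmentvariabledefinition")
{
    ColumnSet = new ColumnSet("defaultvalue")
};
query.Criteria.AddCondition("schemaname", ConditionOperator.Equal, "new_ServiceEndpointId");

// Left-join the current value, if one has been set for this environment.
LinkEntity valueLink = query.AddLink(
    "environmentvariablevalue",
    "environmentvariabledefinitionid",
    "environmentvariabledefinitionid",
    JoinOperator.LeftOuter);
valueLink.Columns = new ColumnSet("value");
valueLink.EntityAlias = "v";

Entity definition = service.RetrieveMultiple(query).Entities.FirstOrDefault();
string endpointId =
    (definition?.GetAttributeValue<AliasedValue>("v.value")?.Value as string)
    ?? definition?.GetAttributeValue<string>("defaultvalue");
```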

Writing the Plugin Code

We can now start developing the plugin code. This is a simple plugin that will run on the Account Create and Account Update messages. Our Plugin includes three code files: Plugin.cs, Account.cs and AccountLogic.cs.

The Plugin.cs is the standard tool generated Plugin.cs file that implements the IPlugin interface. There are only a couple of changes that were done to this class, since we need to communicate with Azure Service Bus.

We added an internal property called CloudService of type IServiceEndpointNotificationService. In the LocalPluginContext constructor, we set the value for the CloudService property.

Add Property for Cloud Service

Get Service for Azure Service Bus Listener

The Account Class adds the Create and Update events to the RegisteredEvents collection, and adds two functions: ExecutePostAccountCreate and ExecutePostAccountUpdate which get the Account Id and call the CreateAccount or UpdateAccount of the logic class.

Account Entity Plugin Logic

The AccountLogic class has a constructor that takes four parameters: the Plugin Execution Context, Organization Service, Tracing Service and the Service Endpoint Notification Service. These are all used for the different purposes of the plugin.

Both the CreateAccount and UpdateAccount functions have the exact same logic. The difference is in the message which is going to be passed to the Azure Service Bus as part of the context.

The CreateUpdateAccount function retrieves all the data from the account record, gets the endpoint id (from the environment variable entities), adds the account record to the shared variables of the context, and finally calls the Azure Service Bus, passing the context, which includes the data from the Account entity.

Create and Update Account
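The heart of CreateUpdateAccount can be sketched like this (a sketch, not the author's exact code; notificationService is the IServiceEndpointNotificationService obtained from the service provider, and serviceEndpointId is the Guid captured earlier):

```csharp
// Sketch: share the account data, then post the execution context to the
// registered service endpoint.
context.SharedVariables.Add("Account", account);

var endpoint = new EntityReference("serviceendpoint", serviceEndpointId);

// For a two-way endpoint, Execute blocks until the listener responds and
// returns the listener's response string.
string response = notificationService.Execute(endpoint, context);
tracingService.Trace("Azure Service Bus response: {0}", response);
```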

After the plugin receives the response, it writes the response to a note within the account record. After the plugin is complete, make sure to sign and compile it, and then register it with the plugin registration tool. When the assembly is registered, add two messages (one for Create and one for Update of the Account entity).

Plugin Registration Tool Account Plugin

Writing the Listener Windows Service Application
The final step is to work on the Console Application. We can start by writing the helper class that will connect to SQL Server and Create a new Account record. The first thing that we need to do is add a connection string to the App.Config that is part of the Console Application. If an App.Config file does not exist, you can create one.

In the Connection String section enter the following code:

<add name="AzureStaging" connectionString="Data Source={LOCALDBSERVERNAME};Initial Catalog=AzureStaging;User ID=azureadmin;Password=Azur3Adm1n;Persist Security Info=False" />

We create a SqlDataManager class which will call the SQL server stored procedures to Create and Update the account records. The functions will receive the values of the different fields in SQL Server and add them as Stored Procedure parameters as shown below:

Listener Create Account (SQL Data Manager)
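A minimal sketch of such a helper might look like this (the stored procedure and parameter names here are assumptions; adjust them to match the actual database scripts):

```csharp
using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

public static class SqlDataManager
{
    // Reads the AzureStaging connection string from App.config (shown above).
    private static string ConnectionString =>
        ConfigurationManager.ConnectionStrings["AzureStaging"].ConnectionString;

    // Hypothetical stored procedure name "InsertAccount".
    public static void CreateAccount(Guid accountId, string name, string description)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand("InsertAccount", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@AccountId", accountId);
            command.Parameters.AddWithValue("@AccountName", name);
            command.Parameters.AddWithValue("@Description", description);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```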

Next we create the class to listen for and process the incoming requests from Azure Service Bus. The class has a default public function called Execute, which accepts a RemoteExecutionContext parameter. This parameter contains the shared variables that we passed from our plugin, as well as the execution context that allows us to retrieve the message name to know whether this is a create or an update.

Azure Service Bus Listener Entry Point
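The entry point can be sketched as follows (a sketch under the assumption that the listener implements the two-way contract ITwoWayServiceEndpointPlugin from the CRM SDK, whose Execute returns the response string sent back to the plugin; the class name is illustrative):

```csharp
// Sketch of the listener entry point; CreateAccount/UpdateAccount wrap the
// SqlDataManager calls described above.
public class ASBListener : ITwoWayServiceEndpointPlugin
{
    public string Execute(RemoteExecutionContext context)
    {
        Entity account = (Entity)context.SharedVariables["Account"];

        switch (context.MessageName)
        {
            case "Create":
                CreateAccount(account);
                return "Account created in SQL Server";
            case "Update":
                UpdateAccount(account);
                return "Account updated in SQL Server";
            default:
                return "Unhandled message: " + context.MessageName;
        }
    }
    // ...
}
```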

The CreateAccount and UpdateAccount functions receive the Account entity, take the necessary attributes and call the CreateAccount function of the SqlDataManager class in order to store the data in SQL Server.

Azure Service Bus Listener Create Account

We added the Service class to the project, which contains an eventlog component that will write errors to the Event Log. The OnStart Event will listen to events and the OnStop Event will stop listening to the Azure Service Bus.

Azure Service Bus Windows Service Start/Stop

The Console application will run as a Windows Service. The solution includes the Project Installer class which allows us to install this as a Windows Service class. The Main entry point of the application provides us with some testing capabilities as well as installation or uninstallation of the Windows Service.

We added a Program class that allows us to install and uninstall the Windows Service from within our Visual Studio debugger.

Install or Uninstall Windows Service from Visual Studio Debugger

Once the Windows Service is installed by running the VS application with the --install parameter, you will see the results in Services. Verify the service is running, and depending on whether you are testing this on your local machine or on a server, determine whether to use the Local System account or the Network Service account.

Azure Service Bus Installed Windows Service

Now that we are done installing all the components, it is time to test all the parts working together. We ran the first test without all the required fields: the Description field was missing, so the result was a new note added to the timeline, returning the missing-field error from SQL Server.

Azure Service Bus Integration (End to End - First Try)

We then tried again, providing the Description field as well. This time we saw the success message in the timeline.

Azure Service Bus Integration (End to End - Second Try)

We wanted to verify that the data existed in SQL Server, so we ran a SQL Statement to validate the data, and received the following results:
New Record created in SQL Server. Results successful

Hope this will be of benefit to you. All the plugin source, database source and listener console application source code is available on GitHub.
https://github.com/ariclevin/Integration/tree/master/ASBListener

Dynamics 365 and SharePoint Online Integration – Web Api Development
https://aric.isite.dev/dynamics/post/dynamics-365-sharepoint-online-integration-webapi/
Wed, 28 Feb 2018
Now that we have configured our SharePoint and Azure environments to allow us to create an API that will be called from CRM and process SharePoint requests, we will dive into the actual API.

If you haven’t developed any API projects with Visual Studio, I would suggest familiarizing yourself with Web API, and take a look at the following Microsoft Documentation: Getting Started with ASP.NET Web API 2 (C#).

The Web Api project contains the following components, which we will get into in further detail in this post:

  • References to Microsoft.SharePoint.Client and Microsoft.SharePoint.Client.Runtime
  • Reference to CAML Query Builder. This is a modified version of the CAML Query builder that is available on github: https://github.com/noormahdi/CAML-Query-Builder
  • Controller classes for DocumentLibrary and LookupItems
  • Model classes for LookupItem, SharePointDocument and SharePointFolder
  • Helper classes for dealing with File Upload, Data, Encryption, and SharePoint

As the API library contains a lot of different functionality, I will go through the process of the retrieve-documents API used to show documents in the subgrid. The full source code and instructions on how to use it will be available soon on GitHub, so that you can implement this in your own environment. We will start by taking a look at one of the HttpGet methods in our API library. This is the starting point: the user executes a request against the API and gets back the list of documents that are available on SharePoint.

[HttpGet]
[Route("api/Library/GetByMasterId/{id}")]
public HttpResponseMessage GetByMasterId([FromUri] string id)
{
	string authorizationString = DecodeAuthorizationString();
	SPHelper.SetSharePointCredentials(authorizationString);

	List<SharePointDocument> files = new List<SharePointDocument>();

	ListItemCollection list = SPHelper.GetDocumentsById(id);
	if (list != null && list.AreItemsAvailable)
	{
		foreach (ListItem item in list)
		{
			SharePointDocument file = ListItemToSharePointDocument(item);
			files.Add(file);
		}
	}

	var response = Request.CreateResponse(HttpStatusCode.OK);
	response.Content = new StringContent(JsonConvert.SerializeObject(files), Encoding.UTF8, "application/json");
	return response;
}

The function starts by retrieving the authorization credentials. These are either passed by the web resource from CRM as an encoded string as part of the request, or, in our case, stored in the application settings. The function checks if Request.Headers.Authorization contains a value. If it does, it retrieves the string from the Authorization header; if not, it creates the authorization string from the application settings Email and Password.

private string DecodeAuthorizationString()
{
	string userPass = string.Empty;
	if (Request.Headers.Authorization == null)
	{
		// throw HttpResponseHelper.GetUnauthorizedResponseException("Auth Header is null!");
		string emailAddress = ConfigurationManager.AppSettings["Email"].ToString();
		string password = ConfigurationManager.AppSettings["Password"].ToString();
		userPass = string.Format("{0}:{1}", emailAddress, password);
	}
	else
	{
		var authHeader = Request.Headers.Authorization;
		if (authHeader.Scheme.ToLower() != Constants.AUTH_HEADER.BASIC)
		{
			throw HttpResponseHelper.GetUnauthorizedResponseException("Auth Header is not using BASIC scheme!");
		}
		var encodedUserPass = authHeader.Parameter;
		userPass = Encoding.UTF8.GetString(Convert.FromBase64String(encodedUserPass));
	}
	return userPass;
}

We then call the SetSharePointCredentials function, passing the authorization string. This uses a helper class that handles connectivity to SharePoint and executes SharePoint client commands against SharePoint.

public static void SetSharePointCredentials(string authorization)
{
	string[] authCredentials = authorization.Split(':');
	ServiceUserName = authCredentials[0].ToString();
	ServicePassword = authCredentials[1].ToString();
}

The GetDocumentsById connects to SharePoint (if not already connected), creates a CAML query with a where clause to search by the Master Id and returns a ListItemCollection.

public static ListItemCollection GetDocumentsById(string masterId)
{
	ClientContext ctx = ConnectToSharePoint();

	List spList = ctx.Web.Lists.GetByTitle("Documents");
	ctx.Load(spList);
	ctx.ExecuteQuery();

	if (spList != null && spList.ItemCount > 0)
	{
		string whereClause =  String.Format(@"<Where>
			<Eq><FieldRef Name='MasterId' /><Value Type='Text'>{0}</Value></Eq>
			</Where> ", masterId);

		CamlQuery camlQuery = new CamlQuery();
		camlQuery.ViewXml = String.Format(
		   @"<View Scope='RecursiveAll'>
			<Query> 
				{0}
			</Query> 
			 <ViewFields>
				<FieldRef Name='FileLeafRef' />
				<FieldRef Name='Title' />
				<FieldRef Name='MasterId' />
				<FieldRef Name='MasterNumber' />
				<FieldRef Name='MasterName' />
			<FieldRef Name='DocumentType' />
				<FieldRef Name='Created' />
			</ViewFields> 
	</View>", whereClause);

		ListItemCollection listItems = spList.GetItems(camlQuery);
		ctx.Load(listItems);
		ctx.ExecuteQuery();

		return listItems;
	}
	else
		return null;

}

We then loop through the items of the collection, and convert each ListItem to a SharePointDocument. SharePointDocument is a Model class that contains the fields that are returned from SharePoint, and will be returned to the web resource as a Json string. The function below shows the ListItemToSharePointDocument function which handles the conversion.

private SharePointDocument ListItemToSharePointDocument(ListItem item)
{
	SharePointDocument file = new SharePointDocument();
	file.FileId = (item["UniqueId"].ToString().ToGuid());
	file.MasterId = item["MasterId"] != null ? item["MasterId"].ToString() : "";
	file.MasterNumber = item["MasterNumber"] != null ? item["MasterNumber"].ToString() : "";
	file.DocumentType = FieldLookupValueToLookupItem(item["DocumentType"]);
	file.MasterName = item["MasterName"] != null ? item["MasterName"].ToString() : "";
	file.FileName = item["FileLeafRef"].ToString();
	file.FilePath = item["FileRef"].ToString();

	return file;
}

Finally, we create the response passing a Status of OK (regardless whether or not files were returned), and serialize the files object as a Json string and return the response.

The list of functions that are included as part of the Api are:

  • GetByMasterId
  • GetByMasterNumber
  • GetByAlternateField
  • DownloadFile
  • UploadDocument
  • SetFileAttributes
  • DeleteFile
  • GetRootFolders
  • GetFolder
  • CreateRootFolder

The next and final article in this blog series will cover the creation of the web resource.


Dynamics 365 and SharePoint Online Integration – Azure Configuration
https://aric.isite.dev/dynamics/post/dynamics-365-sharepoint-online-integration-azure/
Mon, 26 Feb 2018
The next part of this solution is to configure Microsoft Azure. If you do not have an Azure account, you can create a Trial account or a Pay-as-you-go account. If you are a partner or have a BizSpark subscription, you get a certain amount of credits per month, which should be more than enough for the type of implementation that needs to be done using Azure.

The first thing that we need to do is create an application, so that we can register the Client Id for the new application. In your left-hand navigation, click on Azure Active Directory, and select App Registrations from the list of options. Click on the New application registration button shown below.

Azure App Registration

In the Create window, enter a Name for the application, the Application Type (Native), and the Redirect Uri (not really used in this case), and then click on the Create button. Once the application is registered, you will be able to see its properties in the Registered App window. You will need the Application Id to store as the ClientId setting later. Next, we will need to grant the application permissions to the SharePoint server. Click on the Settings button in the Registered App pane. This will open the Settings window.

Azure App Registration

From there, click on Required permissions under the API ACCESS section. By default, you will only have the Windows Azure Active Directory permissions. Click on the Add button to add additional permissions. We will need to add the Office 365 SharePoint Online (Microsoft.SharePoint) permissions. Select Office 365 SharePoint Online, and check all the required permissions. At a minimum, you will need the following permissions:

Azure App Registration Permissions

Now that we have registered the app in Azure, we need to create an App Service in Azure. The type of app that we need to create is an API app, which allows connectivity between Azure and SharePoint.

In the Microsoft Azure navigation click on App Services. This will display the AppService navigation as shown in the following image. Click on the Add button.

Add New App

In the Web Apps section shown below, click on the More button in order to see the selection for API App.

Select Api

Click on the API App link, and then click on the Create button.

Select API App

This will display the new API App window. Enter the name of the app, the Subscription, the Resource Group (either create a new one or use an existing one) and the App Service Plan/Location. Click on the Create button to finalize the creation of the app. Once the app is created, we will need to add some application settings to it, as well as set up Cross-Origin Resource Sharing (CORS), which will allow Dynamics 365 code to call the API functions.

In the Application Settings window, we will enter the following settings (as shown in the image and table below):

App Settings

MAX_ITEMS_PER_FOLDER: The maximum number of files that can be stored in a single SharePoint folder. This should be no more than 5,000.
SITE_URL: The root URL of your SharePoint environment, in the format https://SPROOT.sharepoint.com
WEB_FULL_URL: The URL of the SharePoint document library, in the format https://SPROOT.sharepoint.com/sites/doccenter
ClientId: The client id (or application id) generated by Azure for this application. See the previous instructions.
RedirectUrl: You can set this to the URL of your CRM environment.
Email: An email account with access to SharePoint. This should be a service account rather than an individual user account.
Password: The password of the email account specified above. If you encrypt the password, you will have to modify the code that authenticates against SharePoint to decrypt it.
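For reference, the API app reads these values through ConfigurationManager.AppSettings; settings entered in the Azure portal override anything in Web.config at runtime. A hypothetical reader with a safe default for the folder limit (the class and member names here are assumptions, not from the original code) might look like this:

```csharp
using System;
using System.Configuration;

public static class ApiAppSettings
{
    // SharePoint's list view threshold is 5000 items, so cap the configured
    // folder limit at that value and fall back to it when the setting is
    // missing or not numeric.
    public static int GetMaxItemsPerFolder()
    {
        string raw = ConfigurationManager.AppSettings["MAX_ITEMS_PER_FOLDER"];
        int value;
        return int.TryParse(raw, out value) ? Math.Min(value, 5000) : 5000;
    }

    public static string SiteUrl
    {
        // e.g. https://SPROOT.sharepoint.com
        get { return ConfigurationManager.AppSettings["SITE_URL"]; }
    }
}
```

Reading the settings through a small wrapper like this keeps the fallback behavior in one place instead of scattering default values through the API code.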

Finally, you will have to set up CORS so that only CRM has access to this API. The screenshot below shows the CORS configuration. All that is required is the URL of your CRM environment as the allowed origin.

CORS Setup

In the next article of the series, we will go over the Web Api, and uploading it to Azure.

The post Dynamics 365 and SharePoint Online Integration – Azure Configuration appeared first on Aric Levin's Digital Transformation Blog.

]]>
Dynamics 365 and SharePoint Online Integration – SharePoint https://aric.isite.dev/dynamics/post/dynamics-365-sharepoint-online-integration-sp/ Sun, 25 Feb 2018 21:25:00 +0000 https://aric.isite.dev/index.php/2018/02/25/dynamics-365-and-sharepoint-online-integration-sharepoint/ The easiest part of this solution is to configure SharePoint. There is no custom development involved, but just the creation of the document library, the folders, custom lists and the attributes that we want to use for the solution. We will need to start with the creation of the custom lists as they will be used for the creation of the lookup attributes.

The post Dynamics 365 and SharePoint Online Integration – SharePoint appeared first on Aric Levin's Digital Transformation Blog.

]]>
The easiest part of this solution is to configure SharePoint. There is no custom development involved; it is just the creation of the document library, the folders, the custom lists and the attributes that we want to use for the solution. We will need to start with the creation of the custom lists, as they will be used for the creation of the lookup attributes.

For our particular scenario, we created a Document Library called Document Center with a relative URL of doccenter. In the root folder of the Document Center, we created the following attributes (in addition to the default attributes):

MasterId, MasterNumber, MasterName, DocumentType

SharePoint Attributes

We also created 3 folders for ACCOUNTS, CASES and CONTACTS, though for the purpose of this series, we will only demonstrate the ACCOUNTS subgrid.

We also created a list called DocumentTypes, which is used by the DocumentType attribute to specify the type of document. No special configuration was done to the DocumentTypes list, as we are using only the Title field.

We used the Master Id, Master Number and Master Name fields so that they can be used across the board for the various entities; however, you can add additional fields or change them based on your requirements.
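To make the purpose of these columns concrete, here is a rough CSOM sketch of uploading a file into the ACCOUNTS folder and stamping the custom columns. This is my illustration, not code from the series: it assumes the Microsoft SharePoint Client Components (CSOM), the URLs, file name and values are placeholders, and DocumentType is set as a lookup into the DocumentTypes list.

```csharp
using System.Security;
using Microsoft.SharePoint.Client; // SharePoint CSOM client assemblies

class DocCenterUpload
{
    // Uploads a document into the ACCOUNTS folder of the Document Center
    // library and sets the custom metadata columns described above.
    static void Upload(string webUrl, string email, SecureString password,
                       byte[] content, int documentTypeId)
    {
        using (var ctx = new ClientContext(webUrl))
        {
            ctx.Credentials = new SharePointOnlineCredentials(email, password);

            // Folder path assumes the doccenter library on the doccenter site.
            var folder = ctx.Web.GetFolderByServerRelativeUrl("/sites/doccenter/doccenter/ACCOUNTS");
            var file = folder.Files.Add(new FileCreationInformation
            {
                Url = "contract.pdf",   // placeholder file name
                Content = content,
                Overwrite = true
            });

            // Stamp the custom columns on the uploaded file.
            var item = file.ListItemAllFields;
            item["MasterId"] = "00000000-0000-0000-0000-000000000000"; // CRM record id (placeholder)
            item["MasterNumber"] = "ACC-01234";
            item["MasterName"] = "Adventure Works";
            item["DocumentType"] = new FieldLookupValue { LookupId = documentTypeId };
            item.Update();

            ctx.ExecuteQuery();
        }
    }
}
```

The lookup column takes a FieldLookupValue referencing the id of the DocumentTypes list item, which is why only the Title field is needed on that list.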

In the next section we will review how to configure the Microsoft Azure web site in order to have Azure connect to SharePoint.

The post Dynamics 365 and SharePoint Online Integration – SharePoint appeared first on Aric Levin's Digital Transformation Blog.

]]>
Dynamics 365 and SharePoint Online Integration – Overview https://aric.isite.dev/dynamics/post/dynamics-365-sharepoint-online-integration-overview/ Sun, 25 Feb 2018 21:05:00 +0000 https://aric.isite.dev/index.php/2018/02/25/dynamics-365-and-sharepoint-online-integration-overview/ Over the past few years we have encountered multiple scenarios where the out-of-the-box SharePoint Integration component did not fulfill the requirements of our clients using Dynamics 365 and SharePoint. Some of these had to do with capturing additional attributes in SharePoint, and sharing similar data across different entities in SharePoint.

The post Dynamics 365 and SharePoint Online Integration – Overview appeared first on Aric Levin's Digital Transformation Blog.

]]>
Over the past few years we have encountered multiple scenarios where the out-of-the-box SharePoint Integration component did not fulfill the requirements of our clients using Dynamics 365 and SharePoint. Some of these had to do with capturing additional attributes in SharePoint, and sharing similar data across different entities in SharePoint.

We ended up developing a solution that solves some of these problems by using an Azure API and the SharePoint Client Tools (this can also be done with the SharePoint API, of course), and developing a custom web resource that calls the Azure API functions to provide the ability to upload files to SharePoint as well as download files from SharePoint.

The image below shows the end result of the SharePoint Grid inside of a CRM entity.

SharePoint Grid

I will provide a separate post for each of the following portions of the application over the next week (or so), as well as post a copy of the different portions of the source code on GitHub.

This series will include the following articles:

The post Dynamics 365 and SharePoint Online Integration – Overview appeared first on Aric Levin's Digital Transformation Blog.

]]>
Connecting to Dynamics CRM from Console Application – Part II https://aric.isite.dev/dynamics/post/connect-crm-online-console-ii/ Fri, 22 Dec 2017 17:28:00 +0000 https://aric.isite.dev/index.php/2017/12/22/connecting-to-dynamics-crm-from-console-application-part-ii/ In a recent article I explained how to establish a connection from a Console application to Microsoft Dynamics CRM On-Premise. In this post, I will cover the same steps on how to connect a Console application to Microsoft Dynamics CRM/365 Online using the Microsoft.Xrm.Sdk and Microsoft.Xrm.Tooling.Connector namespaces

The post Connecting to Dynamics CRM from Console Application – Part II appeared first on Aric Levin's Digital Transformation Blog.

]]>
In a recent article I explained how to establish a connection from a Console application to Microsoft Dynamics CRM On-Premise. In this post, I will cover the same steps, showing how to connect a Console application to Microsoft Dynamics CRM/365 Online using the Microsoft.Xrm.Sdk and Microsoft.Xrm.Tooling.Connector namespaces.

The first thing that we need to do is download the Microsoft Dynamics SDK from the Microsoft web site. Depending on your version of CRM (or Dynamics 365), the URL might be different, but you can download the SDK from here. After you download and install the SDK, you will need to add references to the Microsoft.Xrm.Sdk and Microsoft.Xrm.Tooling.Connector assemblies to your Visual Studio Console application. These can be found in the SDK\bin directory.

Once the references have been added, add the following lines of code to your namespace declaration in the Program.cs code window:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;

When using the Xrm.Tooling.Connector namespace, we connect to Microsoft Dynamics CRM using the CrmServiceClient class. If you still require or prefer the OrganizationService, you can use it after establishing connectivity to Dynamics CRM through the CrmServiceClient. We are going to show how to use the CrmServiceClient to connect using both a connection string and individual settings.

To connect using a Connection String we will store the connection string in the App.Config file. The connection string will look as such:

  <connectionStrings>
    <add name="Office365"
         connectionString="Url=https://contoso.crm.dynamics.com; Username=crmadmin@contoso.onmicrosoft.com; Password=abcdef1234; authtype=Office365"/>
  </connectionStrings>

After we added the connection string to the App.Config file, we need to add code to our Main method that will establish the connection using the supplied string. We will perform this logic as follows:

string connectionString = ConfigurationManager.ConnectionStrings["Office365"].ConnectionString;
CrmServiceClient conn = new CrmServiceClient(connectionString);

if (conn.IsReady)
{
   // Perform Additional Actions Here
}
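As a quick sanity check (my addition, not part of the original post), a WhoAmI request is the cheapest round-trip to confirm that the connection actually works; this assumes a reference to the SDK assembly that contains the Microsoft.Crm.Sdk.Messages namespace:

```csharp
using System;
using Microsoft.Crm.Sdk.Messages;   // WhoAmIRequest / WhoAmIResponse
using Microsoft.Xrm.Tooling.Connector;

class ConnectionCheck
{
    static void VerifyConnection(CrmServiceClient conn)
    {
        if (conn.IsReady)
        {
            // A successful WhoAmI proves authentication and routing both work.
            var response = (WhoAmIResponse)conn.Execute(new WhoAmIRequest());
            Console.WriteLine("Connected as user {0}", response.UserId);
        }
        else
        {
            // LastCrmError holds the reason the connection could not be established.
            Console.WriteLine("Connection failed: {0}", conn.LastCrmError);
        }
    }
}
```

When IsReady is false, checking LastCrmError (and LastCrmException) is usually faster than re-running the app with tracing enabled.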

Another way to connect to Microsoft Dynamics CRM using the CrmServiceClient is by providing parameters for the credentials, organization URL and organization name. The following appSettings section within the App.Config file is used to store the credentials and connection information. Note that the password is not encrypted in the app settings, but in a real deployment it should be encrypted one way or another:

  <appSettings>
    <add key="UserName" value="crmadmin@contoso.onmicrosoft.com"/>
    <add key="Password" value="abcdef1234" />
    <add key="InternalUrl" value="contoso.crm.dynamics.com"/>
    <add key="OrgName" value="contosocrm"/>
    <add key="Region" value="NorthAmerica"/>
  </appSettings>

We can then read the application settings from our Program.cs Main function and establish a connection the same way we did with the connection string as follows:

string userName = ConfigurationManager.AppSettings["UserName"].ToString();
string password = ConfigurationManager.AppSettings["Password"].ToString();
string internalUrl = ConfigurationManager.AppSettings["InternalUrl"].ToString();
string orgName = ConfigurationManager.AppSettings["OrgName"].ToString();
string region = ConfigurationManager.AppSettings["Region"].ToString();

// The isOffice365 flag tells CrmServiceClient to authenticate against Office 365
CrmServiceClient conn = new CrmServiceClient(userName, ConvertToSecureString(password), region, orgName, isOffice365: true);

if (conn.IsReady)
{
   // Perform Additional Actions Here
}
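The ConvertToSecureString helper called above is not defined anywhere in the post; it is presumably a small utility along these lines (a minimal sketch, not the author's original code):

```csharp
using System;
using System.Security;

public static class CredentialHelper
{
    // Converts a plain-text password into the SecureString expected by
    // NetworkCredential and the CrmServiceClient constructors.
    public static SecureString ConvertToSecureString(string password)
    {
        if (password == null)
            throw new ArgumentNullException("password");

        var secure = new SecureString();
        foreach (char c in password)
            secure.AppendChar(c);
        secure.MakeReadOnly();
        return secure;
    }
}
```

In the samples above the method is called unqualified, so it would live as a static method inside Program.cs rather than in a separate class.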

As mentioned in the beginning, we are now able to use the CrmServiceClient class to perform actions against CRM. If we need to use the IOrganizationService interface, we can do that as well by adding the following line of code after setting up the CrmServiceClient connection, or after checking that the connection is ready, as shown here:

IOrganizationService _orgService = conn.OrganizationWebProxyClient != null
    ? (IOrganizationService)conn.OrganizationWebProxyClient
    : (IOrganizationService)conn.OrganizationServiceProxy;

At this point you can perform actions against the CRM organization using either the OrganizationService or the CrmServiceClient. In the previous article, we walked through the same example connecting to Microsoft Dynamics CRM On-Premise.

The post Connecting to Dynamics CRM from Console Application – Part II appeared first on Aric Levin's Digital Transformation Blog.

]]>
Connecting to Dynamics CRM from Console Application – Part I https://aric.isite.dev/dynamics/post/connect-crm-onpremise-console-i/ Fri, 22 Dec 2017 03:14:00 +0000 https://aric.isite.dev/index.php/2017/12/22/connecting-to-dynamics-crm-from-console-application-part-i/ Recently I have seen a lot of posts on the CRM Community asking how to establish a connection to CRM from a Console application. In this post I will review the steps of establishing a connection. I will focus on CRM On-Premise using ADFS/Claims/IFD, and will provide additional samples on connecting to Dynamics CRM Online in a separate blog article.

The post Connecting to Dynamics CRM from Console Application – Part I appeared first on Aric Levin's Digital Transformation Blog.

]]>
Recently I have seen a lot of posts on the CRM Community asking how to establish a connection to CRM from a Console application. In this post I will review the steps of establishing a connection. I will focus on CRM On-Premise using ADFS/Claims/IFD, and will provide additional samples on connecting to Dynamics CRM Online in a separate blog article.

The first thing that we need to do is download the Microsoft Dynamics SDK from the Microsoft web site. Depending on your version of CRM (or Dynamics 365), the URL might be different, but you can download the SDK from here. After you download and install the SDK, you will need to add references to the Microsoft.Xrm.Sdk and Microsoft.Xrm.Tooling.Connector assemblies to your Visual Studio Console application. These can be found in the SDK\bin directory.

Once the references have been added, add the following lines of code to your namespace declaration in the Program.cs code window:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;

When using the Xrm.Tooling.Connector namespace, we connect to Microsoft Dynamics CRM using the CrmServiceClient class. If you still require or prefer the OrganizationService, you can use it after establishing connectivity to Dynamics CRM through the CrmServiceClient. We are going to show how to use the CrmServiceClient to connect using both a connection string and individual settings.

To connect using a Connection String we will store the connection string in the App.Config file. The connection string will look as such:

  <connectionStrings>
    <add name="Server=contoso.com, organization=advworks, user=crmadmin@contoso.local"
         connectionString="Url=https://internal.contoso.com/advworks/XRMServices/2011/Organization.svc; Username=contoso\crmadmin; Password=abcdef1234; authtype=IFD"/>
  </connectionStrings>

After we added the connection string to the App.Config file, we need to add code to our Main method that will establish the connection using the supplied string. We will perform this logic as follows:

// Reference the connection string by name; index 0 may be an entry inherited from machine.config
string connectionString = ConfigurationManager.ConnectionStrings["Server=contoso.com, organization=advworks, user=crmadmin@contoso.local"].ConnectionString;
CrmServiceClient conn = new CrmServiceClient(connectionString);

if (conn.IsReady)
{
   // Perform Additional Actions Here
}

Another way to connect to Microsoft Dynamics CRM using the CrmServiceClient is by providing parameters for the credentials, organization URL and organization name. The following appSettings section within the App.Config file is used to store the credentials and connection information. Note that the password is not encrypted in the app settings, but in a real deployment it should be encrypted one way or another:

  <appSettings>
    <add key="UserName" value="contoso\crmadmin"/>
    <add key="Password" value="abcdef1234" />
    <add key="InternalUrl" value="internal.contoso.com"/>
    <add key="OrgName" value="advworks"/>
  </appSettings>

We can then read the application settings from our Program.cs Main function and establish a connection the same way we did with the connection string as follows:

string userName = ConfigurationManager.AppSettings["UserName"].ToString();
string password = ConfigurationManager.AppSettings["Password"].ToString();
string internalUrl = ConfigurationManager.AppSettings["InternalUrl"].ToString();
string orgName = ConfigurationManager.AppSettings["OrgName"].ToString();

NetworkCredential creds = new NetworkCredential(userName, ConvertToSecureString(password));
Microsoft.Xrm.Tooling.Connector.AuthenticationType authType = Microsoft.Xrm.Tooling.Connector.AuthenticationType.IFD;
CrmServiceClient conn = new CrmServiceClient(creds, authType, internalUrl, "443", orgName, true, true, null);

if (conn.IsReady)
{
   // Perform Additional Actions Here
}

As mentioned in the beginning, we are now able to use the CrmServiceClient class to perform actions against CRM. If we need to use the IOrganizationService interface, we can do that as well by adding the following line of code after setting up the CrmServiceClient connection, or after checking that the connection is ready, as shown here:

IOrganizationService _orgService = conn.OrganizationWebProxyClient != null
    ? (IOrganizationService)conn.OrganizationWebProxyClient
    : (IOrganizationService)conn.OrganizationServiceProxy;

At this point you can perform actions against the CRM organization using either the OrganizationService or the CrmServiceClient. In the next article, we will replicate the same example, but connect to Microsoft Dynamics CRM Online using the same methods.

The post Connecting to Dynamics CRM from Console Application – Part I appeared first on Aric Levin's Digital Transformation Blog.

]]>
SQL Server Data Tools for Visual Studio 2017 https://aric.isite.dev/sql/post/sql-server-data-tools-visual-studio-2017-preview/ Mon, 28 Aug 2017 23:55:00 +0000 https://aric.isite.dev/index.php/2017/08/28/sql-server-data-tools-for-visual-studio-2017/ The preview version of SSDT for Visual Studio 2017 (15.3.0 preview) is now available. This release introduces a standalone web installation experience for SQL Server Database, Analysis Services, Reporting Services, and Integration Services projects in Visual Studio 2017 15.3 or later.

The post SQL Server Data Tools for Visual Studio 2017 appeared first on Aric Levin's Digital Transformation Blog.

]]>
The preview version of SSDT for Visual Studio 2017 (15.3.0 preview) is now available. This release introduces a standalone web installation experience for SQL Server Database, Analysis Services, Reporting Services, and Integration Services projects in Visual Studio 2017 15.3 or later.

The preview release of SSDT for Visual Studio 2017 should not be used for production environments at this point. An RTM date is not yet available, but it seems closer now that a preview is out.

The link below will redirect you to the download page.

Download SQL Server Data Tools (SSDT)

The release number is 15.3.0 preview, but the build number for this release is 14.0.16121.0.

SSDT for Visual Studio 2017 has the same system requirements as installing Visual Studio. Supported operating systems are Windows 7 SP1, Windows 8.1 or Windows Server 2012 R2, and Windows 10 or Windows Server 2016, and it is currently only available in English.

The post SQL Server Data Tools for Visual Studio 2017 appeared first on Aric Levin's Digital Transformation Blog.

]]>
Microsoft Dynamics 365 Portals Source Code Available for On-Premise Customers https://aric.isite.dev/dynamics/post/dynamics-365-portals-source-code-available-on-premise/ Sun, 27 Aug 2017 08:00:00 +0000 https://aric.isite.dev/index.php/2017/08/27/microsoft-dynamics-365-portals-source-code-available-for-on-premise-customers/ On August 24, Microsoft released the Microsoft Dynamics 365 Customer Engagement Portals source code. The download contains the self-hosted Portals, which are available for Dynamics 365 On-Premise. This will allow you to customize the portals per your requirements and deploy them to Dynamics 365 (On-Premise) as well as Dynamics 365 (Online).

The post Microsoft Dynamics 365 Portals Source Code Available for On-Premise Customers appeared first on Aric Levin's Digital Transformation Blog.

]]>
On August 24, Microsoft released the Microsoft Dynamics 365 Customer Engagement Portals source code. The download contains the self-hosted Portals, which are available for Dynamics 365 On-Premise. This will allow you to customize the portals per your requirements and deploy them to Dynamics 365 (On-Premise) as well as Dynamics 365 (Online).

The portals are compatible with the Online and On-Premise versions 8.1 and later. The important thing to note is that this source code is provided for customers who are unable to deploy portals in an online environment and are willing to maintain the portals code internally. Microsoft is not planning to support this code, which is also why it is provided under an open source license, and at the time of writing there is no acknowledgement of whether or not it will be supported in the future.

The download contains the solution files, the source code and the deployment instructions for the source code. The Portals web application can be hosted on an internal/organizational IIS server or on Microsoft Azure.

You can download the portals from the following link:

https://www.microsoft.com/en-us/download/details.aspx?id=55789

The post Microsoft Dynamics 365 Portals Source Code Available for On-Premise Customers appeared first on Aric Levin's Digital Transformation Blog.

]]>