SQL Archives - Aric Levin's Digital Transformation Blog
Microsoft Dynamics 365, Power Platform and Azure
https://aric.isite.dev/category/sql/

Moving to Azure Synapse
https://aric.isite.dev/sql/post/move-to-azure-synapse/ | Tue, 06 Jul 2021

I have been working with a client for the past year or so, where they have been using the Data Export Service to write data from their Dataverse environment to an Azure hosted SQL Server.

The Data Export Service, for anyone who is not aware, is a Microsoft solution built on the Data Export Framework that moves data from Microsoft Dynamics 365/Dataverse to an Azure-hosted SQL Server (whether running on an Azure VM, Azure SQL Database or an Azure SQL Managed Instance). It works without requiring any custom development or SQL Server Integration Services.

The Data Export Service is a good solution, but it comes with its own drawbacks, especially for large customers: occasional failures and delays in syncing, complexity in troubleshooting, and the inability to copy configuration from one environment to the next, as would be expected in an ALM process.

Last year, Microsoft introduced an alternative to the Data Export Service. This new alternative, called Azure Synapse Link (or, at the time, Data Lake), makes it easy to export data from your Microsoft Dataverse environment to Azure Data Lake Storage.

There is no deprecation notice for the Data Export Service, and a lot of Microsoft customers are still using it, but it seems like the direction Microsoft is pushing customers toward is Azure Synapse Link (formerly Azure Data Lake) for syncing data between their Microsoft Dataverse environment and an Azure data source.

Azure Data Lake Storage (Generation 2) is an Azure offering that provides the ability to store big data for analytics purposes. It is built on Azure Blob storage, making it cost-effective and robust. The next few steps will show you how to configure the Azure Storage V2 account that is required for setting up the link between Microsoft Dataverse and Azure Synapse Link.

The first step is to log in to the Azure Portal by navigating to https://portal.azure.com, then click on the Create a resource link and search for Storage account. Click on the Create option for Storage account.

Create Resource - Azure Storage Account (Data Lake Gen 2)

Select a subscription and resource group, enter a storage account name, select the region, and leave the rest of the settings as is. The screenshot below shows the first tab of the Storage account creation.

Create Resource - Azure Storage Account (Data Lake Gen 2) - Basics tab

Do not click on the Review + create button at the bottom of the screen; instead, click the Next: Advanced button to move to the Advanced tab.

You will notice in the Advanced tab that there is a section for Data Lake Storage Gen 2. Check the box to enable the hierarchical namespace. The image below shows this tab.

Create Resource - Azure Storage Account (Data Lake Gen 2) - Advanced tab

You can skip the rest of the tabs and click on Review + create to finalize the creation of the Storage account. Once the Azure Storage account is configured, we can go ahead and start configuring the Azure Synapse Link in our Dataverse environment.

Create Resource - Azure Storage Account (Data Lake Gen 2) - Deployment Complete

Navigate back to the Maker portal by going to https://make.powerapps.com. Within your environment, expand the Data menu and click on the Azure Synapse Link menu item. This will open the New link to data lake panel, where you can specify the Subscription, Resource group and Storage account that you created for use with Azure Synapse Link. The image below shows this.

Dataverse - Link to Data Lake - Select Storage Account

You will notice that there is also an option to Connect to your Azure Synapse Analytics workspace, which is currently in preview. This allows us to bring the Dataverse data into Azure Synapse with just a few clicks, visualize the data within the Azure Synapse Analytics workspace and then rapidly start processing the data to discover insights using features like serverless data lake exploration, code-free data integration, data flows for ETL pipelines and optimized Apache Spark for big data analytics.

Let’s go back to our Azure portal and create the Azure Synapse Analytics workspace so that we can set this up at the same time. In your Azure portal, click on the Create a resource link again, and this time search for Azure Synapse Analytics.

Create Resource - Azure Synapse Analytics

This will open the Create Synapse workspace page on the Basics tab. Select your subscription and Resource group. You will also have to enter a name for the managed resource group. A managed resource group is a container that holds ancillary resources created by Azure Synapse Analytics for your workspace. By default, a managed resource group is created for you when your workspace is created.

Enter the name of the workspace and the region. Then you will have to enter the name of the Data Lake Storage Gen2 account that we previously created and the name of the file system. If you do not have a file system yet, click on the Create new link under it and provide a name; this will create the file system for you.

Create Azure Synapse Analytics - Basics tab

Do not click on Review + create, but on the Next: Security button. You will have to provide a password for your SQL Server admin login.

Create Azure Synapse Analytics - Security tab

You can now click on Review + create, and then the Create button, to create the Synapse workspace. This process takes a little longer than the creation of the storage account, as more resources are being created. Once deployment is done, you can go to your new Azure Synapse Analytics resource by clicking on the Go to resource group button.

Create Azure Synapse Analytics - Deployment Complete

Let’s go back to our Maker portal and select the Azure Synapse Link option again, but this time we will also provide the information for the workspace.

Check the Connect to your Azure Synapse Analytics workspace (preview) box, then enter your Subscription, Resource group, Workspace name and Storage account. For the resource group, make sure that you do not select the managed resource group, as it does not have the workspace and the storage account associated with it.

Dataverse - Link to Data Lake - Select Storage Account (with Workspace)

Once you click on the Next button, you can select the tables that you want to sync with Synapse Link. For this purpose we will only select a few tables (account and contact), but you can select as many tables as you want.

Dataverse - Link to Data Lake - Add Tables

Finally, click on the Save button. This will create the connections and allow you to start syncing and reviewing data in Synapse Link and Synapse Analytics. Once completed, the page will refresh and you will see the Linked data lake column showing the Storage account that you previously created.

Now let’s go to Azure and see what happened when we clicked on the Save button. In our storage account's list of containers you will see various containers, one of them bearing the name of your Dataverse organization. Clicking into that container will show you a folder for each of the tables that you selected for synchronization, as well as a model.json file which contains the schema for the entities that you selected.

If you drill down into the selected entities, you will find a single CSV file containing the initial dump from Dataverse. You can view and edit the file directly in Azure Blob storage or download it and open it in Excel. The data will contain all the fields that are part of the entity.

Azure Storage Account - Initial Sync results

When we add a record, we will notice that a new file gets created corresponding to the month of creation. If the record already exists in our Azure Blob storage, a new file will not be created; instead, the existing record will be modified in place.

Azure Storage Account - Sync Results after new record created

When modifying existing records, the changed record gets updated in the corresponding file where it currently exists. In our case, depending on the record, the changes will land in either the 2021-06 or the 2021-07 file.

Now that we see that the files are created, let’s see how we can view this data in Power BI. The first thing that we are going to need is the endpoint of the Data Lake Storage. Within your new Storage account, in the left navigation under Settings, click on Endpoints. On the Endpoints page, under the Data Lake Storage section, you will see Primary and Secondary endpoint URLs for Data Lake Storage. Copy the Primary endpoint URL; this will be used within Power BI. This is shown in the image below:

Azure Storage Account - Endpoints - Data Lake

Next, you are going to need the access key for the Data Lake Storage Gen2 account. In the Azure Storage account, under the Security + networking section, click on Access keys. This will show you the page containing a list of access keys. Click on the Show keys button, and then copy the value of the first key. You will need this for configuring Power BI.

Azure Storage Account - Access Keys

In Power BI, click on the Get Data button in the ribbon, select Azure as the source of the data, and from the available Azure sources select Azure Data Lake Storage Gen2 as shown in the image below:

Power BI - Get Data - Azure Data Lake Storage Gen2

Click on the Connect button, and in the pop-up window, paste the Data Lake Storage Primary endpoint URL that you copied in the previous step, select the CDM Folder View (Beta) option and then click OK.

Power BI - Get Data - Url and Data View (CDM Folder View)

In the next window, you have the option to sign in using an organizational account or an account key. Click on Account key and enter the access key that you copied in the previous step. You will then see the window with the available data source results.

Next is the Navigator, which provides you with a view of the model that you have in Azure Data Lake. Expand the storage container, and then expand the cdm hive. This will show you the list of entities that are available there as tables. See the screenshot below.

Power BI - Get Data - Navigator

Finally, from Power BI, you can start adding visualizations and fields, and customize your data as needed. In the screenshot below, we add a few fields from the accounts table in the Azure Storage account.

When we configured Azure Synapse Link, we checked the box for creating a Synapse workspace as well. If we navigate to the Synapse workspace that we created, we will be able to query the data from our Azure Storage account container from within the Synapse Analytics workspace. There are many configuration options available in the Azure Synapse Analytics workspace, such as analytics pools, encryption, firewall configuration and more. Those can be further reviewed in the Microsoft documentation, but for our purpose, we are going to take a look at Synapse Studio. The image below shows the Azure Synapse workspace overview page, where we can click on Open Synapse Studio to start querying our analytics.

Azure Synapse Analytics - Overview page

When Synapse Analytics Studio opens, there are a lot of available links on how to use this Azure product, which might be overwhelming, but we are just going to review the basics of how to retrieve or query data from the data warehouse. There are a few options that you can use to create a SQL script against Synapse Analytics. You can click on the New button at the top of the page and choose SQL script.

Azure Synapse Analytics Studio - New SQL Script from Home Page

You can also click on the Data icon in the left navigation, under the Home icon, which will allow you to expand into the data structure of the data store. From there, click on the account table, choose New SQL script and then choose Select TOP 100 rows. This will create the SQL script for you, and you can start editing it from there.

Azure Synapse Analytics Studio - New SQL Script from Account Table Data

The last option is clicking on the Develop icon in the left navigation, then clicking on the + button and selecting SQL Script as shown below:

Azure Synapse Analytics Studio - New SQL Script from Develop page

Once we have selected the query option that we want to use, we can go ahead and build our query. In our case we are just going to retrieve the list of accounts as shown in the screenshot below:

Azure Synapse Analytics - Create Query for Account table
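The generated query itself only appears in the screenshot, so here is a minimal sketch of the kind of statement Synapse Studio produces for the account folder. The storage account, container and folder names are placeholders; substitute the values from your own environment.

-- Query the account CSV files that Synapse Link writes to the data lake.
-- The URL is a placeholder; use your own storage account and the container
-- named after your Dataverse organization.
-- Without a WITH clause the columns come back auto-named (C1, C2, ...).
SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://mystorageaccount.dfs.core.windows.net/dataverse-orgname/account/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0'
) AS account_rows;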

When we click on the Run button we will be able to see the results as shown below.

Azure Synapse Analytics - Create Query Results for Account table

There are many more possibilities and options for using Azure Synapse Link and Azure Synapse Analytics and accessing the data from different sources. This article provides a basic review of how to configure them and make them work for your Dataverse environment. I hope this was helpful.

CRUD Support for Virtual Tables
https://aric.isite.dev/development/post/crud-support-virtual-tables/ | Mon, 05 Apr 2021

It seems like this was only a few days ago, but based on the sample that was published by Microsoft, it has been almost two weeks since Microsoft released CRUD support for virtual tables.

To tell you the truth, I have not really worked with virtual entities since they were released. Although I had plenty of reasons to actually use them, it just never seemed the right time. With the recent release, I did my own little PoC to determine the complexity, and if it was always this simple, not picking them up earlier is probably something to regret.

Let’s jump into it. Virtual tables (or virtual entities) have been available for quite some time now, but only for read access. Now, with the addition of Create, Update and Delete, we have full access for integrating between our Dataverse environment and a publicly available SQL Server (or other platform). In today’s post we will go through the process of creating the SQL Server table, creating the plugin, data provider and data source, and finally the virtual table, connecting all the pieces together.

All of the source code in this post is shown in images, but it is available via the GitHub link at the bottom of the post.

Let’s start with the database. In this case I used an Azure SQL database, and created a table called Service Request. The script for creating the table is shown below.

Virtual Tables - Create Table in SQL Server
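The script only appears as an image; below is a hedged approximation of such a table. The authoritative script is in the GitHub repository linked at the bottom of the post, and the column list here is illustrative. Note that a virtual table expects a GUID primary key on the external data source.

-- Illustrative Service Request table; the real script is in the linked repo.
-- Virtual tables require a GUID primary key column on the external source.
CREATE TABLE dbo.ServiceRequest
(
    ServiceRequestId UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT PK_ServiceRequest PRIMARY KEY
        CONSTRAINT DF_ServiceRequest_Id DEFAULT NEWID(),
    Title            NVARCHAR(200)    NOT NULL,
    Description      NVARCHAR(MAX)    NULL,
    CreatedOn        DATETIME2        NOT NULL
        CONSTRAINT DF_ServiceRequest_CreatedOn DEFAULT SYSUTCDATETIME(),
    ModifiedOn       DATETIME2        NULL
);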

Next we create a Plugin Project in Visual Studio. You will need to install the following packages in your project in order to get access to all of the required resources (per Microsoft documentation):

  • Microsoft.CrmSdk.CoreAssemblies
  • Microsoft.CrmSdk.Data
  • Microsoft.CrmSdk.Deployment
  • Microsoft.CrmSdk.Workflow
  • Microsoft.CrmSdk.XrmTooling.CoreAssembly
  • Microsoft.IdentityModel.Clients.ActiveDirectory
  • Microsoft.Rest.ClientRuntime
  • Newtonsoft.Json

Within the plugin project we will create six classes. The first is a static class containing the connection code, and the other five are one for each message (Create, Retrieve, RetrieveMultiple, Update, Delete). You can create them in a single file or in multiple files; for this demonstration they are all created in separate files.

The following using statements have been added to all files, although not all of them are required in every class.

Virtual Tables - Using Statements

Let’s start with the Connection class. We use the SqlConnectionStringBuilder class to generate the Connection string and return it to the rest of the classes that will consume it. This is shown in the image below.

Virtual Tables - Static Connection Class in Plugin

Next, let’s look at the CRUD classes. All of the classes implement the IPlugin interface and its Execute method. In each class, based on the required functionality, we retrieve the execution context in order to read data from the Dataverse environment, and then synchronize between our Dataverse environment and our Azure SQL Server. The screenshots below show the code for each of these.

Create Class:
Virtual Tables - Create Class

Retrieve Class:
Virtual Tables - Retrieve Class

Retrieve Multiple Class:
Virtual Tables - Retrieve Multiple Class

Update Class:
Virtual Tables - Update Class

Delete Class:
Virtual Entities - Delete Class

Once the code for the plugin is completed, we will go ahead and register it via the Plugin Registration Tool. Before registering the plugin, make sure that you have the latest version of the Plugin Registration Tool, which provides support for CRUD operations in virtual tables. At the time of writing this post, the required version was 9.1.0.24. You can find the download link below:

https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool

Now let’s go ahead and register the plugin. This should include all five messages above. Once the plugin is registered, we go ahead and register a new data provider (using the Plugin Registration Tool).

Click on the Register button, and select Register New Data Provider (Ctrl + P).

When selecting the Data Source Entity for the first time, choose Create New from the drop-down. It is important to select the solution that contains the publisher with the prefix that you want to use, or you will have to modify your code accordingly.

Virtual Tables - Plugin Registration Tool - New Data Provider

Once we are done with creating the data provider in the Plugin Registration Tool, we can go ahead and start setting up the Virtual Table in our Dataverse environment. This functionality is not available yet in the Maker Portal, so we will have to do this in the Classic interface (Advanced Settings).

Navigate to Settings and then Administration, select Virtual Entity Data Sources, and click on the New button. A popup will appear with a drop-down to select a data provider, as shown in the image below.

Virtual Tables - Select Data Provider (Dataverse)

Select the name of the data provider that you registered, and click on the OK button. You will then be able to provide a name for the data source.

Click on Settings again and select Solutions. Select the solution where you want to create this table. Check the Virtual Entity option, and select the data source that you just created. Enter a Display Name, Plural Name and Name for the entity. You will notice that there are two additional names: the External Name and the External Collection Name. For the External Name, enter the name of the source table. The External Collection Name can contain the same value as the plural table name.

Virtual Tables - Create Virtual Table (Dataverse)

Once you have finished creating the table, go ahead and create all of the columns that you are going to use to view or update within the virtual entity. Only add columns that you want to either read from your Azure SQL table or write back; any columns that are for management only are not required. The image below shows the columns that were created for our test:

Virtual Tables - Table Columns (Dataverse)

Next, we need to add this entity to our Model Driven App, so that we can test this out. Select an existing MDA or create a new one and add the entity to it.

Finally go ahead and test the results.

Virtual Tables - Demo

The animated screenshot above shows adding, reading and updating entity fields using CRUD operations.

You can click on the link below to get access to the Source Code:
https://github.com/ariclevin/VirtualTables

T-SQL Endpoint for Common Data Service
https://aric.isite.dev/sql/post/t-sql-endpoint-common-data-service/ | Tue, 02 Jun 2020

Since the announcement by Microsoft at the Microsoft Business Applications Summit last month, and even before, I’ve been eager to take a look at the new SQL connection for the Common Data Service endpoint. What this means is that we can now write and execute SQL queries against the entity data.

In order to get started, the first thing that has to be done is enabling the Endpoint, so that we can create a connection from SQL Server Management Studio (SSMS) to our Dynamics/CDS environment. As of now, there are two ways to enable this endpoint.

The first is by navigating to our environment in the Power Platform Admin Center. Click on Settings in the command bar for the environment, expand the collapsed Product section and click on Features. On the Features page, change the toggle for TDS endpoint from Off to On.

TDS endpoint - Power Platform Admin Center

The other way to get the same results is by using the SQL 4 CDS Plugin for XrmToolBox by Mark Carrington (make sure that you have the latest version). Open the Plugin and connect to your environment. In the Object Explorer pane, under entities, you will see the T-SQL Endpoint tree option.

TDS endpoint (Xrm ToolBox - SQL 4 CDS)

Right-click on the T-SQL Endpoint (Disabled) node, and select Enable:

T-SQL endpoint - Xrm ToolBox - SQL 4 CDS - Enable endpoint

Making this change will allow you to make connections to your CDS environment using the new T-SQL endpoint.

Next, we need to connect using SQL Server Management Studio. I am using SSMS v18.5, but you should be able to use some of the prior versions as well. The main thing is how we connect to our cloud database.

Within SSMS, in Object Explorer, click on Connect and select Database Engine. This will pop up the Connect to Server window. In the Server name field, enter the organization name followed by the region and the port. The format is as follows:
orgname.crm{region}.dynamics.com,5558

If your organization name is crmdemo and you are in North America, the Server name would be: crmdemo.crm.dynamics.com,5558

Next, we have to provide the authentication method. Based on your connection there might be a few different options for you to use. You will need to choose one of the three Azure Active Directory Authentication methods based on your connection settings.

I have tested this out with both Azure Active Directory – Password, which is used when you are connecting with a username (email) and password, and Azure Active Directory – Universal with MFA, which will use your organization’s MFA settings to connect.

In this post, I will be using Azure Active Directory – Password. If you are the only one using the computer, you can check Remember Password. The screen below (with a few blurs) shows what the connection dialog looks like.

T-SQL Endpoint - SSMS Connect to Server

Finally, our Object Explorer can be expanded, and we will be able to see all of the available tables that we can query. The image below shows a partial list of the tables that are available:

T-SQL endpoint - SSMS Object Explorer

A few things to note:

  • This is still in preview and not available in all regions
  • The database is in Read-Only state, so no updates can be performed (I am not sure if and when that will change)
  • Only tables and table columns are provided, but there are no custom views or stored procedures. The tables also display the name attributes from related entities.
  • Although what we are seeing in our database shows under tables, these are actually the filtered views.
  • The supported SQL operations include: Batch operations, SELECT, aggregation functions, UNIONs, JOINs and Filtering

Finally, I tested a few queries out including JOIN and Filtering options and got the results just as expected. You can see the screenshot below.

T-SQL endpoint - SSMS Query and results
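The queries themselves only appear in the screenshot; a representative example, using standard Dataverse table and column names, would look something like this:

-- Join accounts to their primary contacts, filtered to active accounts.
-- Table and column names follow the standard Dataverse schema.
SELECT TOP 10
    a.name,
    a.address1_city,
    c.fullname AS primarycontact
FROM account AS a
INNER JOIN contact AS c
    ON a.primarycontactid = c.contactid
WHERE a.statecode = 0      -- 0 = active
ORDER BY a.name;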

Hopefully you are excited about this great new enhancement.

Configure Azure Service Bus to integrate between CDS and On-Premise SQL database
https://aric.isite.dev/azure/post/configure-asb-integrate-cds-sql/ | Mon, 06 Jan 2020

In this blog post I will demonstrate how to use Azure Service Bus and a listener application to integrate between the Common Data Service (Dynamics 365 or Power Apps model-driven application) and an on-premise SQL Server database.

There are various other ways to implement this with a combination of Azure products such as Azure Functions, Azure Logic Apps, the On-Premises Data Gateway and Microsoft Flow, but those are not always available, especially when working in non-commercial environments, or when IT restricts what resources are available to the different departments of the agencies.

In order to implement this, there are a few prerequisites that have to be completed: set up the database server, write the console application that acts as the event listener, set up the Azure Service Bus, and create the plugin code that calls the Azure Service Bus when a particular event happens. The logic of this article is as follows: when an account record gets created or updated, the plugin calls the Azure Service Bus in order to update an on-premise SQL database with the newly created or updated record.

Configuring the SQL database

We will start with the first step, which is the configuration of the database. You can set up the database with a single table (to use only for accounts), or add some related tables if necessary. We will also create a couple of stored procedures to insert into and update the Accounts table; a sketch follows the diagram below, and links to the code files will be shared at the end of the post. The image below displays the database structure. For the purpose of this article, the Contacts and Users tables will not be required.

Azure Staging SQL Database with Tables and Stored Procedures
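The stored procedure scripts are only linked, not shown; here is a minimal sketch of the insert procedure, with an assumed, illustrative column list (the authoritative scripts are in the GitHub repository at the end of the post). The update procedure follows the same pattern, with an UPDATE ... WHERE AccountId = @AccountId statement instead of the INSERT.

-- Illustrative insert procedure for the Accounts staging table.
-- The real column list lives in the scripts linked at the end of the post.
CREATE PROCEDURE dbo.CreateAccount
    @AccountId   UNIQUEIDENTIFIER,
    @Name        NVARCHAR(200),
    @Description NVARCHAR(MAX) = NULL
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.Accounts (AccountId, Name, Description, CreatedOn)
    VALUES (@AccountId, @Name, @Description, SYSUTCDATETIME());
END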

Create the Azure Service Bus Namespace

We can now create the Azure Service Bus. Log in to your Azure Portal and search for Service Bus in the list of available resources. Click on the Add button to add a new service bus.

Add New Azure Service Bus Namespace

This will open the Create namespace window. Enter a name for your service bus; servicebus.windows.net will be appended to the name that you specify. Select a pricing tier, your subscription, resource group and the location where you want this hosted. If you do not have a resource group, you can create a new one from this window.

Azure Service Bus Create Namespace

It will take a couple of minutes (or less) to create your namespace, and then the Azure Service Bus will be visible on your screen. The first thing that we need to do is check the shared access policies. There is a default shared access policy available with the newly created Azure Service Bus, which includes the three available claims: Manage, Send and Listen.

Since we only need Send and Listen, we will create a new shared access policy, naming it IntegrationSharedAccessKey (or any other name that you would like) and setting the available claims to Send and Listen. After you create your shared access policy, click on it to see the keys and the connection strings. You will need them for configuring CDS and your console application.

Configure the Service Endpoint in CDS

Next, let’s add this to our CDS environment by running the Plugin Registration Tool. We will be using the Plugin Registration Tool for version 9.1. Run the Plugin Registration Tool and create a new connection to your CDS environment. Click on Register and select a new Service Endpoint. Copy the connection string from your Service Bus resource in Azure, and paste it in the popup window. Make sure that the Let’s start with the connection string from the Azure Service Bus Portal… option is selected (as shown below).

Add Service Bus Endpoint Connection String

In the next window, Service Endpoint Registration, make the following changes:

  • In the namespace address, change sb to https
  • In the Designation Type, change to Two Way (you can also do One Way if you prefer, but in this particular example we are using two-way).
  • Enter a path for the Service Endpoint. This will be used by the console application to listen to where changes are being made.
  • The Authorization Type should be set to SASKey, with the SASKeyName as the name of your Shared Access policy in Azure, and the SAS Key copied from the Primary Key in the Shared Access policy.

Azure Service Bus Service Endpoint Registration

After we have created the service endpoint, we need to capture the GUID of the service endpoint, as we will use it from the plugin that we will soon start developing. In order to get the GUID, click on the Properties tab and scroll down until you see the ServiceEndpointId property. Copy the GUID from there. We can hard-code this in our application, or add it as an environment variable inside of CDS.

Add an Environment Variable in CDS

Navigate to a solution inside of Power Apps, select the solution and click on the New button in the command bar. You will see the option to add a new environment variable. Enter a display name, schema name (automatically populated), description and data type. As this is going to store a GUID, you should use Text as the data type. You can enter a default value for your new endpoint ID or specify a new value. The screenshot below shows how this is added.

Common Data Service Environmental Variable

As environment variables are still a new feature and there are some issues that have to be dealt with, you can use the Portal Settings entity or another configuration entity to store your variables instead.

Writing the Plugin Code

We can now start developing the plugin code. This is a simple plugin that will run on the Account Create and Account Update messages. Our Plugin includes three code files: Plugin.cs, Account.cs and AccountLogic.cs.

Plugin.cs is the standard tool-generated file that implements the IPlugin interface. Only a couple of changes were made to this class, since we need to communicate with the Azure Service Bus.

We added an internal property called CloudService of type IServiceEndpointNotificationService. In the LocalPluginContext constructor, we set the value for the CloudService property.

Add Property for Cloud Service

Get Service for Azure Service Bus Listener

The Account class adds the Create and Update events to the RegisteredEvents collection, and adds two functions, ExecutePostAccountCreate and ExecutePostAccountUpdate, which get the account ID and call the CreateAccount or UpdateAccount methods of the logic class.

Account Entity Plugin Logic

The AccountLogic class has a constructor that takes four parameters: the plugin execution context, organization service, tracing service and the service endpoint notification service. These are all used for the different purposes of the plugin.

Both the CreateAccount and UpdateAccount functions have the exact same logic. The difference is in the message which is going to be passed to the Azure Service Bus as part of the context.

The CreateUpdateAccount function retrieves all the data from the account record, gets the endpoint ID (from the environment variables entities), adds the account record to the shared variables of the context and finally calls the Azure Service Bus, passing the context, which includes the data from the account entity.

Create and Update Account

After the plugin receives the response, it writes the response to a note within the account record. After the plugin is complete, make sure to sign and compile it, and then register it with the Plugin Registration Tool. When the assembly is registered, add two steps (one for the Create and one for the Update message of the account entity).

Plugin Registration Tool Account Plugin

Writing the Listener Windows Service Application
The final step is to work on the console application. We can start by writing the helper class that will connect to SQL Server and create a new account record. The first thing that we need to do is add a connection string to the App.config that is part of the console application. If an App.config file does not exist, you can create one.

In the connection strings section, enter the following:

<add name="AzureStaging" connectionString="Data Source={LOCALDBSERVERNAME};Initial Catalog=AzureStaging;User ID=azureadmin;Password=Azur3Adm1n;Persist Security Info=False" />

We create a SqlDataManager class which will call the SQL Server stored procedures to create and update the account records. The functions receive the values of the different fields and add them as stored procedure parameters, as shown below:

Listener Create Account (SQL Data Manager)

Next, we create the class that listens to and processes the incoming requests from the Azure Service Bus. The class has a public function called Execute which accepts a RemoteExecutionContext parameter. This parameter contains the shared variables that we passed from our plugin, as well as the execution context that allows us to retrieve the message name, so we know whether this is a create or an update.

Azure Service Bus Listener Entry Point

The CreateAccount and UpdateAccount functions receive the account entity, take the necessary attributes and call the corresponding function of the SqlDataManager class in order to store the data in SQL Server.

Azure Service Bus Listener Create Account

We added the Service class to the project, which contains an EventLog component that writes errors to the event log. The OnStart event starts listening for events and the OnStop event stops listening to the Azure Service Bus.

Azure Service Bus Windows Service Start/Stop

The console application will run as a Windows Service. The solution includes the ProjectInstaller class which allows us to install this as a Windows Service. The Main entry point of the application provides us with some testing capabilities as well as installation or uninstallation of the Windows Service.

We added a Program class that allows us to install and uninstall the Windows Service from within our Visual Studio debugger.

Install or Uninstall Windows Service from Visual Studio Debugger

Once the Windows Service is installed by running the VS application with the –install parameter, you will see the results in Services. Verify that the service is running, and depending on whether you are testing this on your local machine or on a server, determine whether to use the Local System account or the Network Service account.

Azure Service Bus Installed Windows Service

Now that we are done installing all the components, it is time to test all the parts working together. We ran a first test, and the record did not have all the required fields; the Description field was missing, so the result was a new note added to the timeline returning the missing-field error from SQL Server.

Azure Service Bus Integration (End to End - First Try)

We then tried again, providing the Description field as well. This time we saw the success message in the timeline.

Azure Service Bus Integration (End to End - Second Try)

We wanted to verify that the data existed in SQL Server, so we ran a SQL statement to validate the data, and received the following results:
New Record created in SQL Server. Results successful
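The exact validation statement is not shown in the post; a hedged sketch, using the table and column names assumed in the stored procedure sketch earlier, would be:

-- Confirm the most recently synchronized account rows arrived.
SELECT TOP 10 AccountId, Name, Description, CreatedOn
FROM dbo.Accounts
ORDER BY CreatedOn DESC;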

Hope this will be of benefit to you. All of the plugin source, database source and listener console application source code is available on GitHub.
https://github.com/ariclevin/Integration/tree/master/ASBListener

SQL Server Data Tools for Visual Studio 2017
https://aric.isite.dev/sql/post/sql-server-data-tools-visual-studio-2017-preview/ | Mon, 28 Aug 2017

The preview version of SSDT for Visual Studio 2017 (15.3.0 preview) is now available. This release introduces a standalone web installation experience for SQL Server Database, Analysis Services, Reporting Services, and Integration Services projects in Visual Studio 2017 15.3 or later.

The preview release of SSDT for Visual Studio 2017 should not be used for production environments at this point. An RTM date is not available yet, but it seems closer now that a preview is available.

The link below will redirect you to the download page.

Download SQL Server Data Tools (SSDT)

The release number is 15.3.0 preview, but the build number for this release is 14.0.16121.0.

SSDT for Visual Studio 2017 has the same system requirements as installing VS. Supported operating systems are Windows 7 SP1, Windows 8.1 or Windows Server 2012 R2, and Windows 10 or Windows Server 2016, and it is currently only available in English.

Permission accessing SSRS Report Server/Manager
https://aric.isite.dev/sql/post/permission_accessing_ssrs_report_server/ | Fri, 04 Apr 2014

Recently we encountered an issue where a domain admin (and local admin) account was trying to access the SSRS Report Manager and Report Server web sites and was getting the following error:
User DOMAIN\UserAccount does not have the required permissions. Verify that the permissions have been granted and Windows User Account Control (UAC) restrictions have been addressed.

The easiest solution to the problem is to disable UAC on that machine (a restart will be required). After that, you should be able to access the SSRS Report Manager and Report Server web sites.

If you need to keep UAC running, the following KB article details the configuration required for the installation to both install and run SSRS:
http://support.microsoft.com/kb/934164

Deploy a Report Builder 3.0/VS 2010 Report on SSRS 2008
https://aric.isite.dev/dynamics/post/deploy_vs2010_report_on_ssrs2008/ | Tue, 19 Jul 2011

Visual Studio 2010 and Report Builder 3.0 RDL reports use a new schema that is not supported by SQL Server 2008 Reporting Services. They are designed to work only with SSRS 2008 R2. Currently there is no way to publish a VS2010 report on SQL Server 2008 without some modifications to the RDL code.

In order to publish the RDL in SSRS 2008, the following steps have to be performed:

1. Open the RDL file (not project) with Visual Studio or an Xml Editor.

2. Replace the report namespace occurrences from 2010 to 2008:

<Report xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner" 
  xmlns:cl="http://schemas.microsoft.com/sqlserver/reporting/2010/01/componentdefinition" 
  xmlns="http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition">

Should be replaced with:

<Report xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner" 
  xmlns:cl="http://schemas.microsoft.com/sqlserver/reporting/2008/01/componentdefinition" 
  xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition">

3. Remove the following XML tags (keep the XML between the tags):

<ReportSections>
  <ReportSection>
  </ReportSection>
</ReportSections>

4. Save the XML file and publish the report. It should now work with SSRS 2008.

Report Viewer Control not displaying data when deployed to IIS7
https://aric.isite.dev/dynamics/post/report_viewer_control_not_displaying_data_in_iis7/ | Fri, 05 Nov 2010


This issue sometimes happens when you deploy a web application that contains report files on IIS7. The reports work fine in the development environment, but once published, nothing appears in the results. The report window will either complete and show zero pages, or the “Report is being generated” message will show up.

The problem is that the web browser is unable to locate Reserved.ReportViewerWebControl.axd, even though the HTTP handler might be included in the web.config file.

To verify that this is the problem, open the Fiddler web debugger in your Internet Explorer session and check for any 404 errors on the Reserved.ReportViewerWebControl.axd URL.

To resolve this issue we have to manually add the HTTP handler to the web site in IIS. Report Viewer 2008 Redistributable with SP1 should, of course, be installed.

  1. Open IIS.
  2. Click on the web site that you want to add the Handler Mapping to
  3. Double Click on the Handler Mappings
  4. From the actions Task Menu click on Add Managed Handler
  5. Enter the following information in the Add Managed Handler Window:
    Request Path: Reserved.ReportViewerWebControl.axd
    Type: Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
    Name: Reserved-ReportViewerWebControl-axd
  6. Click OK and you’re done.

Fixing the tp_Title field reverting to Membership Provider username
https://aric.isite.dev/sharepoint/post/fixing_tp_title_field_reverting_to_membership_provider_username/ | Sat, 21 Aug 2010

If you are using Forms Based Authentication for your SharePoint site, you might encounter the site displaying a user name such as “i:0#.f|sql-membershipprovider|username” instead of just “username”.

You can modify the tp_Title field in your SQL Server UserInfo table directly, but at times it seems to be reverted back to its original value. There is a solution to this problem; however, it is unsupported of course, as you are modifying a stored procedure in the SharePoint database directly.

In your SharePoint database, locate the proc_UpdateUserInfoInTableFromRowUpdater stored procedure.

Under the declaration section of the @OldIsActive bit field, add the following code:

IF (SUBSTRING(@Title, 1, 3) = 'i:0')  -- 1-based; the original SUBSTRING(@Title, 0, 3) compares only 2 characters and never matches
BEGIN
    DECLARE @LastIndex int

    -- Position of the last '|' in the claims-encoded login (-1 if none)
    SET @LastIndex = (SELECT CASE CHARINDEX('|', @Title)
        WHEN 0 THEN -1
        ELSE LEN(@Title) - CHARINDEX('|', REVERSE(@Title)) + 1 END)

    -- Keep only what follows the last '|' (the plain username);
    -- starting at @LastIndex + 1 avoids including the '|' itself
    SET @Title = SUBSTRING(@Title, @LastIndex + 1, LEN(@Title) - @LastIndex)
END

You might have to modify the first line of code based on the way the tp_Title is displayed in your database. 

FBA and Friendly Display Name with SharePoint 2010 and MOSS 2007
https://aric.isite.dev/sharepoint/post/fba_and_friendly_display_name_in_sharepoint/ | Thu, 08 Jul 2010

When implementing Forms Based Authentication (FBA), SharePoint retrieves the tp_title field from the UserInfo table in the WSS_Content* database to display the name of logged in user.

Depending on the method of adding the users to the SharePoint site, this login might appear with the membership provider details as part of the display name.

The easiest way to modify this requires a direct update to the SharePoint UserInfo table (at your own risk).

At every site collection level, SharePoint stores the name in the UserInfo table. The display name is stored in a field called tp_title. Modify that field to the display name that you want, log in to the site, and you will see the proper display name. A hedged example follows below.
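As an example of that update (unsupported, as noted above, and with placeholder values):

-- Direct, unsupported update of the SharePoint content database.
-- Replace the display name and the claims-encoded login with your own values.
UPDATE dbo.UserInfo
SET tp_Title = 'Jane Smith'
WHERE tp_Login = 'i:0#.f|sql-membershipprovider|jsmith';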
