You've downloaded the package, unzipped it and it runs just fine... or does it?
The update button doesn't work :(
The issue is that Windows has helpfully blocked access to the DLLs the registration tool calls.
The fix is to right-click each file in turn and unblock them... or you can run this nifty command in PowerShell and save yourself a nasty dose of RSI.
1. Run PowerShell as an Administrator
2. Run the following command, but update the path to the folder where you unzipped the tool:
Get-ChildItem "C:\Software\Dynamics 365 Plugin Registration" | Unblock-File
3. Have a pint, you've earned it :)
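If the tool's DLLs end up nested in subfolders, a recursive variant of the same command (just a sketch, assuming the same unzip location) unblocks everything underneath in one go:
# Unblock the tool's files and anything nested below it (hypothetical path)
Get-ChildItem "C:\Software\Dynamics 365 Plugin Registration" -Recurse -File | Unblock-File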
Thursday 31 October 2019
Hyper-V Backup Script...
Slightly off topic, but still very relevant when developing Dynamics 365 solutions. Typically that developer sandpit we're working hard on is running in Hyper-V, and whilst we're using source control to back up our work, backing up our VMs is just as important.
Here's a nifty PowerShell script for Windows 10 that, for each Hyper-V Virtual Machine, will:
1. Save the state of the VM if it's running
2. Robocopy the VM including snapshots to a backup drive
3. Restart the VM if it was running
# ----------------------------------------------------------
# HYPER-V BACKUP SCRIPT
# ----------------------------------------------------------
$Date = Get-Date -Format yyyyMMdd
$BackupRoot = "Z:\HyperV"
$LogRoot = "Z:\Logs"

# Back up each Hyper-V Virtual Machine
$VMs = Get-VM * | Select-Object Name, State, Path
foreach ($VM in [array]$VMs)
{
    $VmName = $VM.Name
    $FolderPath = $VM.Path
    $InitialState = $VM.State

    "Backing up Virtual Machine: $VmName - $FolderPath - $InitialState"

    # 1. Save the state of the VM if it's running
    if ($InitialState -eq "Running")
    {
        "Saving VM........"
        Save-VM $VmName
        "Done!"
    }

    # 2. Robocopy the VM (including snapshots) to the backup drive
    $BackupPath = Join-Path $BackupRoot $VmName
    $LogPath = Join-Path $LogRoot ($Date + "_" + $VmName + "_Backup_Log.txt")
    "Starting backup...."
    Robocopy $FolderPath $BackupPath /e /mir /np /tee /mt /log+:$LogPath #/XD *.vhdx
    "Backup complete!"

    # 3. Restart the VM if it was running
    if ($InitialState -eq "Running")
    {
        "Re-starting VM........"
        Start-VM $VmName
        "Done!"
    }
}
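To run the backup unattended you can register the script as a scheduled task. Here's a minimal sketch, assuming the script has been saved as C:\Scripts\Backup-HyperV.ps1 (the path, task name and schedule are just placeholders):
# Register a nightly 2am run of the backup script (hypothetical path and task name)
$Action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Backup-HyperV.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "HyperV Backup" -Action $Action -Trigger $Trigger -User "SYSTEM" -RunLevel Highest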
Monday 14 October 2019
Getting a D365 Record ID via Bookmark
This is a nifty trick for getting the ID of a record in D365:
1. Create a new bookmark with the following as the URL:
javascript: if (window.prompt("CRM Record GUID is :", $("iframe").filter(function () { return ($(this).css("visibility") == "visible") })[0].contentWindow.Xrm.Page.data.entity.getId().slice(1, -1))) { }
2. Navigate to the record you require in D365.
3. Hit the bookmark.
Saturday 17 August 2019
Adding support for a Swagger REST API in a Dynamics Plugin
So recently I've been working on consuming a Swagger REST API in a Sandboxed plugin.
Visual Studio now includes some nice features to support this such as the ability to automatically generate an API Client and the models used by an API directly using Swagger.
You simply right-click > Add > REST API Client and then enter the URL of the Swagger documentation (e.g. http://api.yourapi.com/swagger/docs/v1).
Visual Studio generates all the classes for you.
So far so good... :)
However, before you skip happily off into the sunset there are a couple of gotchas....
1. The generated API client classes cannot be consumed from a Sandboxed plugin - they cause security exceptions because they reference libraries not supported by plugins running in isolation mode. If you are running on-prem outside the Sandbox you're good to go - if not... you'll need to call the API directly using HttpClient instead, for example:
// Build the JSON payload and flag it as application/json
var content = new StringContent(json);
content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
// Post directly to the API endpoint
HttpClient client = new HttpClient();
HttpResponseMessage resp = await client.PostAsync($"{apiUrl}api/dostuff", content);
return resp;
2. If your plugin merges other libraries using ILMerge and you also reference Newtonsoft.Json you may hit issues. This is because at the point you generate the REST API classes a NuGet package reference is added to your project. In my case I already had a reference to Newtonsoft.Json version 12; the REST API package references version 6. This works fine - the version difference is handled and the classes are generated correctly. However, when I attempt to compile after the operation has completed, ILMerge barfs and the build fails.
The fix is to delete the NuGet package and recompile. Everything works again... but you've lost a couple of hours of your life and that right-click > Add > REST API Client has lost some of its sparkle ;)
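As an aside, when working through the first gotcha it can save time to hit the endpoint from PowerShell before testing inside the plugin - a minimal sketch, reusing the example endpoint naming from above with a made-up payload:
# Quick smoke test of the API outside the plugin (URL and payload are placeholders)
$json = '{ "name": "test" }'
Invoke-RestMethod -Uri "http://api.yourapi.com/api/dostuff" -Method Post -Body $json -ContentType "application/json"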
Thursday 2 May 2019
ILMerge for Dynamics 365
Once upon a time implementing ILMerge for your plugin assemblies was a royal PITA...not anymore.
This makes it soooo easy :)
https://community.dynamics.com/crm/b/nishantranaweblog/archive/2017/05/17/using-ilmerge-for-plugin-in-crm
Sunday 23 February 2014
Enterprise Development Part 2 - ALM Framework for Dynamics CRM
The ALM Framework for Dynamics CRM is a C# console application designed to support an automated, scheduled build. Its primary purpose is to enable the following process to be executed from a command line or a Windows scheduled task:
1. Get the latest version of the Visual Studio web resources project (VSProj1) from Team Foundation Server (TFS).
2. Deploy the web resources to the "master" CRM Server and publish all.
3. Increment the version of the target CRM solution on the master server.
4. Export the target CRM solution as a zip file to a temporary location on disk.
5. Check out the workspace folder containing the unpacked/extracted CRM solution files in the local workspace (VSProj2).
6. Unpack the CRM solution exported from the master server to disk - overwriting the existing files in VSProj2.
7. Modify the Visual Studio project that contains the unpacked solution files (VSProj2) - new files are added to the .csproj and deleted files are removed.
8. Check in the TFS workspace.
Pre-Reqs and Assumptions
1. You are using Team Foundation Server for Source Control.
2. You are using an existing CRM Developer SDK Visual Studio project to source control and develop your web resources (VSProj1).
3. You have an existing Visual Studio project to store the unpacked CRM solution for source control purposes. (VSProj2)
Under the Hood
The Build Automation tool is simply a mechanism for automating calls to a number of other tools that actually do the work. They are as follows:
TF.exe
This is a command-line tool for interacting with and automating Team Foundation Server operations. Build Automation calls TF.exe to "Get Latest", "Checkout", "Checkin", "Delete" and "Label" the CRM solution files.
Solution Packager
This is a command-line tool included in the CRM SDK. It is used for unpacking CRM solutions to disk as individual files so that they can then be source controlled. It is also used for packaging the individual files back into a zip file CRM solution that can be imported into a target CRM server.
Solution Manager
A command-line tool to provide xRM solution management. This tool is used to export, import and version CRM solutions. In the process above it exports the CRM solution file that is then unpacked by the Solution Packager.
Build Automation for Dynamics CRM provides the glue that binds these three components together to support the process outlined for source controlling a CRM solution.
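For a feel of what the console app is automating, here's a rough sketch of the equivalent commands run by hand - the TFS paths, solution name and folders are placeholders, and the export/versioning step handled by Solution Manager is left as a comment since its command line isn't covered here:
# 1. Get the latest web resources project from TFS (hypothetical server path)
tf get "$/CrmProject/WebResources" /recursive

# 3-4. Version and export the solution zip via Solution Manager (tool-specific, not shown)

# 5. Check out the folder holding the unpacked solution files
tf checkout "C:\Workspace\CrmSolution" /recursive

# 6. Unpack the exported solution zip over the workspace copy
SolutionPackager.exe /action:Extract /zipfile:"C:\Temp\MySolution.zip" /folder:"C:\Workspace\CrmSolution"

# 8. Check the updated files back in
tf checkin "C:\Workspace\CrmSolution" /recursive /comment:"Automated CRM solution check-in"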
Wednesday 29 January 2014
Enterprise Development Part 1 - Source Controlling CRM Solutions in the Real World
So here is the first in a series of posts regarding Enterprise Development and Dynamics CRM. Our first requirement was to automate the source controlling of our CRM customisations. However, before I get stuck into the detail, a bit of context:
1. Our development is based on the following:
- Dynamics CRM 2011
- Visual Studio 2010
- Team Foundation Server 2012
2. Our development environment is split between a data centre and office network. Although there is a VPN connecting them there is no routing from the data centre network to the office network. This will become a critical factor when determining our approach.
3. We have a single instance of Dynamics CRM that we treat as our Master for CRM customisations and this server is hosted on the office network. The TFS server is hosted on the data centre network.
4. Developers work on standalone instances of CRM and can work on all aspects of CRM including customisations. However, as there is a single server acting as the master for customisations any changes to anything but web resources must be made either directly on this server or exported as unmanaged customisations and imported into this server.
NB. We generally make these changes directly on the server as exporting and importing can prove problematic when multiple developers make changes to the same entity.
ALM Frameworks for Dynamics CRM
Considering Dynamics CRM is itching to increase its footprint in the Enterprise sector, the support for enterprise-level development within the product is a bit limited. However, there are some existing frameworks and open source tools that step into the breach:
Visual Studio xRM CI Framework
A great suite of components including a custom TFS Build template definition that supports exporting and source controlling your customisations. In almost every scenario this would be the answer....
However, in our case this was not suitable as the TFS Build Agent requires a network port accepting inbound connections (by default this is 9191) from the build server and the network restrictions meant our TFS server could not communicate with our build agent. Furthermore if your build agent is on a different domain this can also cause complications but there is at least a workaround for this issue here.
For those development environments that aren't constrained by the network restrictions we encountered though this would be the recommended approach and you can find it here.
Adx Studio ALM for Dynamics CRM
A collection of powershell cmdlets for supporting Continuous Integration and automated deployments. Powerful but there's a licensing cost and you'll need a reasonable understanding of powershell. There's a bit of a learning curve in getting your head round the framework as well. Unfortunately in our case the cost proved prohibitive and so we went with a different approach....
ALM Framework for Dynamics CRM
A console application for automating the extraction, source control, packaging and deployment of Dynamics CRM solutions. It's a lightweight and open source approach to build automation. You can find it here: http://crmbuildautomation.codeplex.com.