During my CRM solution import debugging yesterday, I also wanted to see which users had logged in to CRM over the last few days. After some googling and trying, I came up with this SQL statement for listing all users and the last time they accessed CRM during the last 3 days.
NB! You have to change “OrgName” (it appears both in the database name and in the WHERE clause) to get this working on your CRM database server. The script is tested on CRM 2011 and CRM 2016.
USE MSCRM_Config

SELECT O.FriendlyName, SU.FullName AS Name, SUO.LastAccessTime
FROM SystemUserOrganizations SUO
LEFT JOIN SystemUserAuthentication SUA
    ON SUO.UserId = SUA.UserId AND LEFT(SUA.AuthInfo, 1) = 'C'
LEFT JOIN Organization O
    ON SUO.OrganizationId = O.Id
INNER JOIN OrgName_MSCRM.dbo.SystemUser SU
    ON SUO.CrmUserId = SU.SystemUserId
WHERE SUO.LastAccessTime IS NOT NULL
    AND O.FriendlyName = 'OrgName'
    AND DATEDIFF(DAY, SUO.LastAccessTime, GETUTCDATE()) < 3
ORDER BY SUO.LastAccessTime
If you run into problems with Dynamics CRM On-Premises, you can enable tracing with PowerShell. In my case, I needed debug information on why my solution import was failing when moving it to a new organization.
Open the PowerShell prompt and load the snap-in with the Add-PSSnapin command shown in 1). You can then list the trace settings with the command shown in 2). Before you start tracing, you should determine the time window in which the error occurs and enable tracing as close to the error as possible. Run the commands in 3) to start tracing, and stop it immediately after the error has occurred with the commands in 4).
# 1) Add the CRM PowerShell snap-in
Add-PSSnapin Microsoft.Crm.PowerShell

# 2) Get the current CRM trace settings
Get-CrmSetting TraceSettings

# 3) Enable tracing
$Setting = Get-CrmSetting TraceSettings
$Setting.Enabled = $True
$Setting.CallStack = $True
$Setting.Categories = "*:Verbose"
$Setting.Directory = "C:\temp\crmtrace"
Set-CrmSetting $Setting

# 4) Disable tracing
$Setting = Get-CrmSetting TraceSettings
$Setting.Enabled = $False
Set-CrmSetting $Setting
When you have tons of log files, the trace tool CRM Trace Reader is nice to use for searching and filtering.
After we found out that the SOTI Enterprise Mobility Management system didn’t fully support Windows 10 Store apps in “Kiosk Mode”, we had to rewrite our latest app using WPF technology instead.
In this process, I wanted a kind of watermark in my TextBox controls. After some googling, I found a pretty nice library called “Extended WPF Toolkit” on CodePlex (and NuGet).
How to create a watermark input textbox
- Add the “Extended.Wpf.Toolkit” package via NuGet
- Add the XML namespace at the top of the XAML file
- Use a “xctk:WatermarkTextBox” instead of a “TextBox” control, with the Watermark attribute set to the help text
<xctk:WatermarkTextBox x:Name="txtSearch" Watermark="type search pattern" />
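Putting the steps together, a minimal window could look like the sketch below. The xctk prefix maps to the toolkit’s standard schema URI; the window class name and layout are just for illustration.

```xml
<Window x:Class="MyApp.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:xctk="http://schemas.xceed.com/wpf/xaml/toolkit"
        Title="Search" Height="120" Width="300">
    <StackPanel Margin="10">
        <!-- The watermark text is shown while the box is empty -->
        <xctk:WatermarkTextBox x:Name="txtSearch"
                               Watermark="type search pattern" />
    </StackPanel>
</Window>
```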
Blogging hasn’t been my first priority over the last year due to different circumstances. As a result, I have decided to set a goal of at least one blog post every month. In this industry we have to learn something new every day to keep up with the changes, so there should always be something to write about 🙂
So, what have I been doing for the last year (2015)?
It started with a competence boom at the KiPi 2015 (“Know It, Prove It”) Challenge at Microsoft Virtual Academy in February where I followed and completed Cloud Development, Mobile Development and Hybrid Cloud learning paths. This inspired me to look at the different Azure exams, but unfortunately, busy projects made it impossible to complete these.
Between March and New Year, I worked mainly on upgrade and migration projects for customers. The next few blog posts will summarize my experience from these projects and describe what kind of knowledge from them I have put into my toolbox.
The T-SQL script below finds all tables and columns with a foreign key referencing a particular primary key column (set in the WHERE clause as [pk-table].[pk-column]). This script is pretty useful when you are working close to the database, manipulating data directly, and so on.
SELECT
    FK_Table = FK.TABLE_NAME,
    FK_Column = CU.COLUMN_NAME,
    PK_Table = PK.TABLE_NAME,
    PK_Column = PT.COLUMN_NAME,
    Constraint_Name = C.CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS C
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS FK
    ON C.CONSTRAINT_NAME = FK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS PK
    ON C.UNIQUE_CONSTRAINT_NAME = PK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE CU
    ON C.CONSTRAINT_NAME = CU.CONSTRAINT_NAME
INNER JOIN (
    SELECT i1.TABLE_NAME, i2.COLUMN_NAME
    FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS i1
    INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE i2
        ON i1.CONSTRAINT_NAME = i2.CONSTRAINT_NAME
    WHERE i1.CONSTRAINT_TYPE = 'PRIMARY KEY'
) PT ON PT.TABLE_NAME = PK.TABLE_NAME
WHERE PK.TABLE_NAME = '[pk-table]'
    AND PT.COLUMN_NAME = '[pk-column]'
Microsoft has released three Azure specialist exams over the last few months. I have been watching a lot of videos on Microsoft Virtual Academy, Channel9 and Pluralsight for the last few years, and very intensively since December 2014.
70-532 Developing Microsoft Azure Solutions. This is a developer exam for people who want to be able to design, program, implement, automate, and monitor Microsoft Azure solutions.
70-533 Implementing Microsoft Azure Infrastructure Solutions. This is an exam for IT pros and solution architects who want to implement an infrastructure solution in Microsoft Azure. Candidates have experience implementing and monitoring cloud and hybrid solutions, as well as supporting application lifecycle management.
70-534 Architecting Microsoft Azure Solutions. This is an exam for solution architects, who should know the features and capabilities of Azure services to be able to identify trade-offs and make decisions when designing public and hybrid cloud solutions. Candidates who take this exam are expected to be able to define the appropriate infrastructure and platform solutions to meet the required functional, operational, and deployment requirements through the solution lifecycle.
- Channel9 – video series
Our department had an interesting challenge last week. We have an old local on-premises Team Foundation Server (TFS) in our data room, with two VMware hosts containing a number of virtual machines. Due to new company policies, we needed to move all domain-bound VMs to a new domain. As we feared, this caused a few problems due to old versions and incorrect editions of different software.
The first thing I did was to perform a “Get Latest” on all source code, in addition to taking a VM snapshot, before we started the actual migration process. We needed a “Plan B” in case the migration failed. After a few days of migration failures, with loads of issues between SharePoint, Project Server, SQL Server and TFS, we decided to make a clean install and move the source code into new team projects. The problem now was that the old TFS server had about 60 team projects that needed to be recreated manually.
As a lazy programmer, I prefer a command-line utility to help me with this project creation. Luckily, 99% of the team projects didn’t use the SharePoint site, so for the moment I just had to migrate source code to the version control of the new TFS server.
The command-line tool needed for the team project creation is called “TFS Power Tools”, and it exists for the latest versions of Visual Studio (2012 and 2013). Here is the command template I have used for our team projects.
tfpt createteamproject /collection:"http://[IP or Hostname]:8080/tfs/DefaultCollection" /teamproject:"[project name]" /processtemplate:"Microsoft Visual Studio Scrum 2.2" /sourcecontrol:New /noreports /noportal
I found the list of team project directories by using “dir /b” from the DOS prompt, put this directory list into Excel, and generated one command for each project based on the command line above. All these commands were put into a command file (.cmd) and run. When this is completed, I will add all files from the projects via the “Source Control Explorer” in Visual Studio. Some manual work is still needed.
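Instead of Excel, the same command file could be generated with a short script. Here is a sketch in Python; the collection URL and the project names are placeholders, and in practice the name list would come from “dir /b” or os.listdir on the old project root.

```python
# Sketch: generate one "tfpt createteamproject" command per team
# project name, equivalent to the Excel approach described above.
# COLLECTION is a placeholder - replace with the real TFS host.
COLLECTION = "http://tfsserver:8080/tfs/DefaultCollection"

TEMPLATE = (
    'tfpt createteamproject /collection:"{collection}" '
    '/teamproject:"{project}" '
    '/processtemplate:"Microsoft Visual Studio Scrum 2.2" '
    '/sourcecontrol:New /noreports /noportal'
)

def build_commands(project_names, collection=COLLECTION):
    """Return one tfpt command line per team project name."""
    return [TEMPLATE.format(collection=collection, project=name)
            for name in project_names]

# Hypothetical project names stand in for the real directory list.
commands = build_commands(["Billing", "Intranet", "MobileApp"])
with open("create-projects.cmd", "w") as f:
    f.write("\n".join(commands) + "\n")
```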