Almost two weeks in…

Well, I’m almost two weeks into my new job. It feels good to be back in a school again. It feels even better to be back in a job with a wider range of responsibilities. It feels brilliant to be in a job where I feel like I actually have something to do. 

My new boss seems like a good guy so far. He’s very happy to listen to my thoughts and ideas and he is encouraging me to look at the way things are and to suggest improvements. He is quite new in the position too; he’s only been there for about three months. I suspect that if I’d been invited for interview in the first round, I’d probably have started at around the same time as him.

There is so much to do. With all due respect to the people looking after the network there before me, I think there have been several questionable design choices. Some of the security policies are downright scary. 

The first thing that needs to be done is to install the latest version of Smoothwall. Unfortunately, the virtual farms that the current instances are stored on are too old to install the latest version. This means that either we have to install some Smoothwall appliances or we need to update the virtual farms. I’m in the process of getting pricing for both options. 

Other things I’ve been doing are:

  • Enabling deduplication on one of their shared network drives. More than 350GB of savings on 1.2TB of data! (There’s a quick sketch of the commands involved after this list.)
  • Attempting to wrap my head around the instance of Veeam they have backing up their virtual farms. 
  • Installing Lansweeper to get an idea of what hardware and software we have in the place. 
  • Installing a KMS server. Seriously, more than 1200 machines using MAK keys is nuts!
  • Installing PasswordState, a locally installed password management system similar to LastPass and Dashlane. 
  • Installing some RADIUS servers to handle wireless authentication. 
  • Planning to deploy WSUS to manage updates on servers. 
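
As a footnote to the deduplication point above, it takes surprisingly little to turn on. A minimal sketch, assuming Server 2012 R2 and that the share lives on an E: volume (the drive letter and usage type are placeholders for whatever your environment needs):

Install-WindowsFeature -Name FS-Data-Deduplication

#Enable dedup for general file shares and kick off an optimisation pass immediately rather than waiting for the schedule
Enable-DedupVolume -Volume E: -UsageType Default
Start-DedupJob -Volume E: -Type Optimization

#Check the savings once the job has finished
Get-DedupStatus -Volume E: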

It may sound hyperbolic but I feel I’ve done more in two weeks there than I did in an entire year at Westminster. I hope it continues like this. I’m sure it will. 

By the way, as and when I get the time, I’m working on a new article about the Parallels Mac Management product for ConfigMgr. There have been some pretty big updates for it in the last year and one of my contacts at Parallels very kindly let me download it to have a look. Suffice to say, I’m very impressed with what I see there. 

SCOM – SQL Management Pack Configuration

SCOM is a bastard of a product, isn’t it? It’s even more so when you’re trying to monitor a SQL instance or two. It’s also quite amusing that Chrome doesn’t recognise SCOM as a word in its dictionary and that it suggests SCAM as a possible alternative 🙂

My major project at work for the past few months has been SCOM. I am monitoring about 300 Windows VMs, about a third of which have SQL database instances on them. I’ve stuck with using the LocalSystem account as the SCOM action account and for the majority of the time, that’s enough. However, there have been a few times where it hasn’t been. It’s always a permissions issue: the LocalSystem account doesn’t have access to one or more of the databases, so the discovery and monitoring scripts can’t run and you get a myriad of alerts.

When it comes to adding a management pack into SCOM, always read the damn documentation that comes with the MP. I know it’s tedious but it’s necessary. Reading the documentation for the SQL management pack, found on Microsoft’s website, gives you some interesting recommendations. They suggest that you have three action accounts for SQL:

  1. A discovery account
  2. A default action account
  3. A monitoring account

They also recommend that you put the monitoring and discovery accounts into an additional AD group. Once you do that, you have to add the users to SQL, assign them specific permissions to databases, give them access to parts of the Windows registry, assign them permissions to various WMI namespaces, grant them local logon privileges and more. I’m not going to go over the whole process; if you really want to see it, look at Microsoft’s documentation.
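
Creating the accounts and the group themselves is a quick one-off job before any of that starts. Here’s a minimal sketch using the ActiveDirectory module; the account and group names match the variables used in the script below, but the OU path is a made-up placeholder and you’ll want your own password handling:

Import-Module ActiveDirectory

$ou = "OU=Service Accounts,DC=intranet,DC=local" #hypothetical OU, change to suit
$password = Read-Host -AsSecureString "Password for the SQL action accounts"

#One account each for default actions, discovery and monitoring
"om_aa_sql_da","om_aa_sql_disc","om_aa_sql_mon" | ForEach-Object {
    New-ADUser -Name $_ -SamAccountName $_ -Path $ou -AccountPassword $password -Enabled $true -PasswordNeverExpires $true
}

#The low-privilege group holds the discovery and monitoring accounts, as the MP documentation suggests
New-ADGroup -Name "SQLMPLowPriv" -GroupScope Global -Path $ou
Add-ADGroupMember -Identity "SQLMPLowPriv" -Members "om_aa_sql_disc","om_aa_sql_mon"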

The point is, it’s a lot of work. Wouldn’t it be nice if we could automate it? Well, I’ve written a script that does precisely that. It’s a big one:

Function Set-WmiNamespaceSecurity {
Param ( [parameter(Mandatory=$true,Position=0)][string] $namespace,
    [parameter(Mandatory=$true,Position=1)][string] $operation,
    [parameter(Mandatory=$true,Position=2)][string] $account,
    [parameter(Position=3)][string[]] $permissions = $null,
    [bool] $allowInherit = $false,
    [bool] $deny = $false,
    [string] $computerName = ".",
    [System.Management.Automation.PSCredential] $credential = $null)

Process {
    #$ErrorActionPreference = "Stop"

    Function Get-AccessMaskFromPermission($permissions) {
        #WMI namespace access rights and their friendly names
        $WBEM_ENABLE = 1
        $WBEM_METHOD_EXECUTE = 2
        $WBEM_FULL_WRITE_REP = 4
        $WBEM_PARTIAL_WRITE_REP = 8
        $WBEM_WRITE_PROVIDER = 0x10
        $WBEM_REMOTE_ACCESS = 0x20
        $READ_CONTROL = 0x20000
        $WRITE_DAC = 0x40000

        $WBEM_RIGHTS_FLAGS = $WBEM_ENABLE,$WBEM_METHOD_EXECUTE,$WBEM_FULL_WRITE_REP,`
            $WBEM_PARTIAL_WRITE_REP,$WBEM_WRITE_PROVIDER,$WBEM_REMOTE_ACCESS,`
            $READ_CONTROL,$WRITE_DAC
        $WBEM_RIGHTS_STRINGS = "Enable","MethodExecute","FullWrite","PartialWrite",`
            "ProviderWrite","RemoteAccess","ReadSecurity","WriteSecurity"

        $permissionTable = @{}

        for ($i = 0; $i -lt $WBEM_RIGHTS_FLAGS.Length; $i++) {
            $permissionTable.Add($WBEM_RIGHTS_STRINGS[$i].ToLower(), $WBEM_RIGHTS_FLAGS[$i])
        }

        $accessMask = 0

        foreach ($permission in $permissions) {
            if (-not $permissionTable.ContainsKey($permission.ToLower())) {
                throw "Unknown permission: $permission`nValid permissions: $($permissionTable.Keys)"
            }
            $accessMask += $permissionTable[$permission.ToLower()]
        }

        $accessMask
    }

    if ($PSBoundParameters.ContainsKey("Credential")) {
        $remoteparams = @{ComputerName=$computerName;Credential=$credential}
    } else {
        $remoteparams = @{ComputerName=$computerName}
    }

    $invokeparams = @{Namespace=$namespace;Path="__systemsecurity=@"} + $remoteParams

    $output = Invoke-WmiMethod @invokeparams -Name GetSecurityDescriptor
    if ($output.ReturnValue -ne 0) {
        throw "GetSecurityDescriptor failed: $($output.ReturnValue)"
    }

    $acl = $output.Descriptor

    $computerName = (Get-WmiObject @remoteparams Win32_ComputerSystem).Name

    #Work out the domain and account name from whichever format was passed in
    if ($account.Contains('\')) {
        $domainaccount = $account.Split('\')
        $domain = $domainaccount[0]
        if (($domain -eq ".") -or ($domain -eq "BUILTIN")) {
            $domain = $computerName
        }
        $accountname = $domainaccount[1]
    } elseif ($account.Contains('@')) {
        $domainaccount = $account.Split('@')
        $domain = $domainaccount[1].Split('.')[0]
        $accountname = $domainaccount[0]
    } else {
        $domain = $computerName
        $accountname = $account
    }

    $getparams = @{Class="Win32_Account";Filter="Domain='$domain' and Name='$accountname'"}

    $win32account = Get-WmiObject @getparams

    if ($win32account -eq $null) {
        throw "Account was not found: $account"
    }

    switch ($operation) {
        "add" {
            if ($permissions -eq $null) {
                throw "-Permissions must be specified for an add operation"
            }
            $accessMask = Get-AccessMaskFromPermission($permissions)

            $ace = (New-Object System.Management.ManagementClass("win32_Ace")).CreateInstance()
            $ace.AccessMask = $accessMask
            if ($allowInherit) {
                $ace.AceFlags = 0x2 #CONTAINER_INHERIT_ACE
            } else {
                $ace.AceFlags = 0
            }

            $trustee = (New-Object System.Management.ManagementClass("win32_Trustee")).CreateInstance()
            $trustee.SidString = $win32account.Sid
            $ace.Trustee = $trustee

            if ($deny) {
                $ace.AceType = 0x1 #ACCESS_DENIED_ACE_TYPE
            } else {
                $ace.AceType = 0x0 #ACCESS_ALLOWED_ACE_TYPE
            }

            $acl.DACL += $ace.psobject.immediateBaseObject
        }

        "delete" {
            if ($permissions -ne $null) {
                throw "Permissions cannot be specified for a delete operation"
            }

            #Rebuild the DACL without any ACEs belonging to the account
            [System.Management.ManagementBaseObject[]]$newDACL = @()
            foreach ($ace in $acl.DACL) {
                if ($ace.Trustee.SidString -ne $win32account.Sid) {
                    $newDACL += $ace.psobject.immediateBaseObject
                }
            }

            $acl.DACL = $newDACL.psobject.immediateBaseObject
        }

        default {
            throw "Unknown operation: $operation`nAllowed operations: add delete"
        }
    }

    $setparams = @{Name="SetSecurityDescriptor";ArgumentList=$acl.psobject.immediateBaseObject} + $invokeParams

    $output = Invoke-WmiMethod @setparams
    if ($output.ReturnValue -ne 0) {
        throw "SetSecurityDescriptor failed: $($output.ReturnValue)"
    }
}
} #end function Set-WmiNamespaceSecurity

Function Add-DomainUserToLocalGroup {
Param ([string]$computer,[string]$group,[string]$domain,[string]$user)
$de = [ADSI]"WinNT://$computer/$group,group"
$de.psbase.Invoke("Add",([ADSI]"WinNT://$domain/$user").path)
} #end function Add-DomainUserToLocalGroup

Function Add-UserToLocalLogon {
Param ([Parameter(Mandatory=$true)][string]$UserSID)
#Export the current security policy and find the current SeInteractiveLogonRight setting
$tmp = [System.IO.Path]::GetTempFileName()
secedit.exe /export /cfg "$($tmp)"
$c = Get-Content -Path $tmp
$currentSetting = ""

foreach($s in $c) {
    if( $s -like "SeInteractiveLogonRight*") {
        $x = $s.split("=",[System.StringSplitOptions]::RemoveEmptyEntries)
        $currentSetting = $x[1].Trim()
    }
}

#If the SID isn't already granted the right, add it and re-apply the policy
if( $currentSetting -notlike "*$($UserSID)*" ) {
    if( [string]::IsNullOrEmpty($currentSetting) ) {
        $currentSetting = "*$($UserSID)"
    } else {
        $currentSetting = "*$($UserSID),$($currentSetting)"
    }

    $outfile = @"
[Unicode]
Unicode=yes
[Version]
signature="`$CHICAGO`$"
Revision=1
[Privilege Rights]
SeInteractiveLogonRight = $($currentSetting)
"@

    $tmp2 = [System.IO.Path]::GetTempFileName()

    $outfile | Set-Content -Path $tmp2 -Encoding Unicode -Force

    Push-Location (Split-Path $tmp2)

    try {
        secedit.exe /configure /db "secedit.sdb" /cfg "$($tmp2)" /areas USER_RIGHTS
    } finally {
        Pop-Location
    }
}
} #end function Add-UserToLocalLogon

#Set Global Variables

$Default_Action_Account = "om_aa_sql_da"
$Discovery_Action_Account = "om_aa_sql_disc"
$Monitoring_Action_Account = "om_aa_sql_mon"
$LowPrivGroup = "SQLMPLowPriv"

$WindowsDomain = "Intranet"
#Add users to local groups

Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Performance Monitor Users" -user $Monitoring_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Performance Monitor Users" -user $Default_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Event Log Readers" -user $Monitoring_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Event Log Readers" -user $Default_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Users" -user $LowPrivGroup -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Users" -user $Default_Action_Account -domain $WindowsDomain
#AD SIDs for Default Action Account user and Low Priv group - required for adding users to local groups and for service security settings.

#Define SIDs for Default Action and Low Priv group. To get a SID, use the following command:
#Get-ADUser -identity [user] | select SID
#Get-ADGroup -identity [group] | select SID
#Those commands are part of the Active Directory PowerShell module, which is why they're not in this script; I can't assume that this script is being run on a DC or on
#a machine with the AD management shell installed

$SQLDASID = "S-1-5-21-949506055-860247811-1542849698-1419242"
$SQLMPLowPrivsid = "S-1-5-21-949506055-860247811-1542849698-1419239"

Add-UserToLocalLogon -UserSID $SQLDASID
Add-UserToLocalLogon -UserSID $SQLMPLowPrivsid

#Set WMI Namespace Security

Set-WmiNamespaceSecurity root add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\cimv2 add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\default add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement10'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement10 add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity }
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement11'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement11 add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity }

Set-WmiNamespaceSecurity root add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\cimv2 add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\default add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement10'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement10 add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity }
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement11'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement11 add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity }

#Set Registry Permissions

$acl = Get-Acl 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($Default_Action_Account)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$acl = $null
$acl = Get-Acl 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($LowPrivGroup)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$acl = $null

$SQLInstances = Get-ChildItem 'registry::hklm\SOFTWARE\Microsoft\Microsoft SQL Server' | ForEach-Object {Get-ItemProperty $_.pspath } | Where-Object {$_.pspath -like "*MSSQL1*" }

$SQLInstances | Foreach {
$acl = Get-Acl "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($LowPrivGroup)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$acl = $null

$acl = Get-Acl "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($Default_Action_Account)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$acl = $null
}


#Set SQL Permissions

#Get SQL Version
if ($SQLInstances.Count -eq $null) {

#Only one instance installed
$version = Get-ItemProperty "registry::HKLM\Software\Microsoft\Microsoft SQL Server\$($SQLInstances.PSChildName)\MSSQLSERVER\CurrentVersion"

} else {

#Multiple instances installed; take the version of the first one found
$version = Get-ItemProperty "registry::HKLM\Software\Microsoft\Microsoft SQL Server\$($SQLInstances[0].PSChildName)\MSSQLSERVER\CurrentVersion"

}

#Import appropriate SQL PowerShell module

if ($version.CurrentVersion -ge 11) {
#Import SQL 2012 Module
Import-Module sqlps
#change out of the SQLSERVER: provider context that sqlps drops you into
Set-Location C:
} else {
#Add SQL 2008 Snap-ins
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
}

#Create database users and assign permissions

$CreateDatabaseUsers = "use master

create login [$($WindowsDomain)\$($LowPrivGroup)] from windows

grant view server state to [$($WindowsDomain)\$($LowPrivGroup)]
grant view any definition to [$($WindowsDomain)\$($LowPrivGroup)]
grant view any database to [$($WindowsDomain)\$($LowPrivGroup)]
grant select on sys.database_mirroring_witnesses to [$($WindowsDomain)\$($LowPrivGroup)]

create login [$($WindowsDomain)\$($Default_Action_Account)] from windows

grant view server state to [$($WindowsDomain)\$($Default_Action_Account)]
grant view any definition to [$($WindowsDomain)\$($Default_Action_Account)]
grant view any database to [$($WindowsDomain)\$($Default_Action_Account)]
grant alter any database to [$($WindowsDomain)\$($Default_Action_Account)]
grant select on sys.database_mirroring_witnesses to [$($WindowsDomain)\$($Default_Action_Account)]

#Generate query to assign users and permissions to databases
$DatabaseUsers1 = "SELECT 'use ' + name + ' ;'
+ char(13) + char(10)
+ 'create user [$($WindowsDomain)\$($LowPrivGroup)] FROM login [$($WindowsDomain)\$($LowPrivGroup)];'
+ char(13) + char(10) + 'go' + char(13) + char(10)
FROM sys.databases WHERE database_id = 1 OR database_id >= 3
SELECT 'use msdb; exec sp_addrolemember @rolename=''SQLAgentReaderRole'', @membername=''$($WindowsDomain)\$($LowPrivGroup)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)
SELECT 'use msdb; exec sp_addrolemember @rolename=''PolicyAdministratorRole'', @membername=''$($WindowsDomain)\$($LowPrivGroup)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)

$DatabaseUsers2 = "SELECT 'use ' + name + ' ;'
+ char(13) + char(10)
+ 'create user [$($WindowsDomain)\$($Default_Action_Account)] FROM login [$($WindowsDomain)\$($Default_Action_Account)];'
+ 'exec sp_addrolemember @rolename=''db_owner'', @membername=''$($WindowsDomain)\$($Default_Action_Account)'';'
+ 'grant alter to [$($WindowsDomain)\$($Default_Action_Account)];'
+ char(13) + char(10) + 'go' + char(13) + char(10)
FROM sys.databases WHERE database_id = 1 OR database_id >= 3
SELECT 'use msdb; exec sp_addrolemember @rolename=''SQLAgentReaderRole'', @membername=''$($WindowsDomain)\$($Default_Action_Account)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)
SELECT 'use msdb; exec sp_addrolemember @rolename=''PolicyAdministratorRole'', @membername=''$($WindowsDomain)\$($Default_Action_Account)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)

$SQLInstances | Foreach {
if ($_.PSChildName.split('.')[-1] -eq "MSSQLSERVER") {
$InstanceName = $env:COMPUTERNAME
} else {
$InstanceName = "$($env:COMPUTERNAME)\$($_.PSChildName.split('.')[-1])" }

#Create the logins, then run the generated per-database scripts
Invoke-Sqlcmd -ServerInstance $InstanceName -Query $CreateDatabaseUsers
$Provision1 = Invoke-Sqlcmd -ServerInstance $InstanceName -Query $DatabaseUsers1
$Provision2 = Invoke-Sqlcmd -ServerInstance $InstanceName -Query $DatabaseUsers2

$Provision1 | foreach {
Invoke-Sqlcmd -ServerInstance $InstanceName -Query $_.ItemArray[0]
}
$Provision2 | foreach {
Invoke-Sqlcmd -ServerInstance $InstanceName -Query $_.ItemArray[0]
}
}

#Grant Default Action account rights to start and stop SQL Services

$SQLServices = Get-Service -DisplayName "*SQL*"

$SQLServices | Foreach {
& c:\windows\system32\sc.exe sdset $_.Name D`:`(A`;`;GRRPWP`;`;`;$($SQLDASID)`)`(A`;`;CCLCSWRPWPDTLOCRRC`;`;`;SY`)`(A`;`;CCDCLCSWRPWPDTLOCRSDRCWDWO`;`;`;BA`)`(A`;`;CCLCSWLOCRRC`;`;`;IU`)`(A`;`;CCLCSWLOCRRC`;`;`;SU`)`S`:`(AU`;FA`;CCDCLCSWRPWPDTLOCRSDRCWDWO`;`;`;WD`)
}

There are huge swathes of this script that I cannot take credit for, mostly the functions.

The Set-WmiNamespaceSecurity function was pilfered from Palo Alto’s website, though it appears to have been written by Microsoft themselves.

The Add-DomainUserToLocalGroup function was stolen from the Hey, Scripting Guy! blog.

The Add-UserToLocalLogon function was lifted wholesale from another blog.

The rest, however, is all mine, which you can probably tell from the quality of the code. You will need to change some of the variables in the #Set Global Variables section to match your environment. That said, it works and that’s all I care about. Enjoy!

Sigh, sometimes software can be too bloody clever for its own good. The Code Block module that I’m using isn’t doing a very good job of formatting this script and it’s replaced some characters such as &, < and > with their HTML equivalents. I think I’ve weeded them all out but I may not have. If not, let me know.

OMG! It’s been a year!

I have really been neglecting this blog. Considering that I’m paying to have the damn thing hosted, this can’t stand.

So what has happened in the last year? Quite a lot really.

First of all, the new job mentioned in the last post. If I’m honest, I’m not happy in that job. I’m not going to go too far into specifics. Suffice to say, the University of Westminster is a good place to work; they are very generous towards their staff, I am well paid there, I get a ludicrous amount of leave and the benefits package is very good. However, the job isn’t for me. I feel far too pigeon-holed. I miss the depth of work from my old job, the amount of responsibility that I had. I don’t like being third-line only very much. I don’t like being sat at my desk all day, every day. I don’t like not being able to get my hands on hardware occasionally. Perhaps unbelievably, I even miss the small amount of interaction that I had with users. I’ve been keeping my eyes open for new jobs and I saw a new one about two months ago. In fact, I even applied twice; the first time around I didn’t get through to the interview stage. It seems that they didn’t recruit in the first run so they re-advertised. I got some feedback from the man who is in charge of recruitment and I applied again. The second time, I got invited to interview.

I went to the interview. I thought that I had messed it up entirely. The first part of the interview was the technical part. I was expecting a test along the lines of almost every other technical test I’ve taken, i.e. “What is a FSMO Role?”, “Why can’t this printer print?”, “Who do you prioritise, the headmaster or a classroom that can’t work?”, you know the sort of thing. Instead, they gave me a series of scenarios and twenty minutes to put some thoughts down on them. While obviously they were interested in my technical skill set, they were more interested in how I approach problems and how my thought processes worked. When they were analysing my answers, there was one question that they asked which I really made a pig’s ear of and which I couldn’t answer. A very awkward silence ensued while I desperately tried to understand what they wanted from me but in the end, they put me out of my misery. Truth be told, I was almost ready to walk away at that point.

The panel interview came after with the ICT Manager and the Head of Performance and Recruitment. That went a little better although I did give a very arrogant answer to a question: They asked if I thought I could do the job and I said that I wouldn’t be sitting there if I didn’t. I cringed almost immediately. Argh.

So anyway, I couldn’t have done too badly as I got the job. On 1st August 2016 I shall be starting work for the Haberdashers’ Aske’s multi-academy trust in New Cross, London. My job title will be IT Systems Administrator. It’s closer to home, it’s more money, there is going to be a lot more variety and responsibility and it’s in an environment that I think I’ll be much more comfortable in. I’m a lot more optimistic about this job than I was about Westminster and I’m looking forward to starting tremendously.

So what else has happened in the last year? Well, I changed the hosting provider of this site. I originally bought space from GoDaddy as it was cheap for the first year. However, subsequent years were stupidly expensive so I said “Bugger that” and changed host. My site is now hosted by SGIS Hosting, who are a lot more reasonable and migrated this site to their servers for me. They give you less space and bandwidth than GoDaddy but I have more than enough for my purposes and I have some space with which I can mess about outside of WordPress if I want to. The only major disadvantage is that I now have to keep WordPress up to date manually.

More significantly, my girlfriend and I have moved away from Hertfordshire. This made me sad, I really loved it in Harpenden and I miss the place a lot. However, it was necessary for reasons that will probably become clear later on in this article. We moved into a nice flat in Beckenham. The rent is a lot more but travel costs are considerably less so it more or less evens out. With this new job, travel costs will be lower still.

The last major thing that has happened is that I have become a father! My son was born on the 25th May 2016 at 10.47. The preceding month and the first week of his life were by far the most stressful time of my life! Towards the end of my girlfriend’s pregnancy, there were complications and he ended up being born early and very small. He was taken to the Special Baby Care Unit (SCBU) at our local hospital where he was looked after for about a week. He was in an incubator for about two days with a glucose drip in his hand. After that, the drip came out and he was put into a cot. In the end, all they were doing was feeding him so they decided that it would be best to send him home. The birth was also very hard on my girlfriend; she ended up staying in the hospital for as long as our son.

If there was ever a cause worth donating money to, it is a SCBU. If you’re reading this and are feeling generous, please feel free to have a look at the Princess Royal University Hospital SCBU Fund’s JustGiving page and chuck a few quid their way. If you don’t want to donate to my local one, please look one up closer to you and donate to them instead. They all (not just the PRUH’s, all of them everywhere) do wonderful work under extremely difficult circumstances and they all deserve far more support than they get.

Anyway, our son is the primary reason we have moved to Beckenham. My girlfriend’s family is from around here and her sister lives nearby. My girlfriend wanted that support network close to her for when our baby arrived. I understand that and support it so here we are! On the whole, it’s a good thing as my girlfriend and our son are both getting much better care from the hospitals down here. In addition, I wouldn’t have been able to get the job in New Cross if we still lived in Harpenden so I’d probably still be stuck at Westminster for the foreseeable future.

So in summary, on a professional level, the last year has been pretty mediocre. On a personal level, despite the stresses and heartache, it’s been awesome. Once again, I toast the future! I’m looking forward to it once again!

The Future

I have a new job.

As of the 1st September, I am going to be working for a university in central London as a “Windows Specialist”. If I’m entirely honest, I’m not entirely sure what my day-to-day duties are going to be but I have inferred that it’s going to involve helping to migrate from Novell eDirectory to Active Directory, some SQL Server stuff and Commvault.

I have spent almost eight years in my current job. I am happy in it and I wasn’t looking to move on. However, sometimes you see an opportunity and you just have to grab it. I’m going to be moving onto a network that spans a large part of London. They have multiple campuses. Their IT has separate teams for Windows, Unix, Infrastructure and Desktop. It’s going to be second and third line mostly, I think; the amount of interaction that I have with users is going to be less than what it is at the moment. No more desktop support! It is, probably literally, an order of magnitude bigger than anything I’ve ever done before and I’m simultaneously excited and completely bricking it.

Perhaps unusually, I asked to extend my notice period. I wanted to work one final summer at the college and get my projects finished and loose ends tied up. They are in the final planning stages now and I’ll be putting them in place in a week’s time. Additionally, I wanted to get some proper handover documentation written too. So far, the document is more than 8,000 words long and there’s plenty more to do. It’s a shame I couldn’t have met my successor to hand over to them in person but that’s the way things go sometimes.

The other thing that this extended notice period has done for me is given me a chance to get my head around the idea of leaving where I am and moving on. The difference between moving on this time and the last time is that the last time, I was desperate to go. This time around, I’m upset to be leaving and I’m still a little worried that I’m moving on before I’m ready to go. Don’t get me wrong, I know that I’m capable of doing the job; that’s not my concern. My concern is that I’ve been happy where I am, more than a bit settled, and that moving on is going to be an upheaval.

Anyway, the end of term has come and I was one of 16 members of staff leaving this summer. I was mentioned in the principal’s end of year speech and he said some extremely kind words, comparing me to Scotty in Star Trek saying that I worked in the background, quietly and methodically keeping things going and fixing them when they blew up. He also said I’d be incredibly hard to replace which is always nice to hear.

Anyway, to the future! I’m looking forward to it.

Building, Deploying and Automatically Configuring a Mac Image using SCCM and Parallels SCCM Agent

I touched briefly on using the Parallels Management Agent to build Macs in my overview article but I thought it might be a good idea to go through the entire process that I use when I have to create an image for a Mac, getting the image deployed and getting the Mac configured once the image is on there. At the moment, it’s not a simple process. It requires the use of several tools and, if you want the process to be completely automated, some Bash scripting as well. The process isn’t as smooth as you would get from solutions like DeployStudio but it works and, in my opinion anyway, it works well enough for you not to have to bother with a separate product for OSD. Parallels are working hard on this part of the product and they tell me that proper task sequencing will be part of v4 of the agent. As much as I’m looking forward to that, it doesn’t change the fact that right now we’re on v3.5 and we have to use the messy process!

First of all, I should say that this is my method of doing it and mine alone. This is not Parallels’ method of doing this; it has not been sanctioned or condoned by them. There are some dangerous elements to it. You follow this procedure at your own risk and I will not be held responsible for any damage caused if you try it out.


You will need the following tools:

  • A Mac running OS X Server. The server needs to be set up as a Profile Manager server, an Open Directory server and, optionally, as a Netboot server. It is also needed on Yosemite for the System Image Utility.
  • A second Mac running the client version of OS X.
  • Both the server and the client need to be running the same version of OS X (Mavericks, Yosemite, whatever) and they need to be patched to the same level. Both Macs need to have either FireWire or Thunderbolt ports.
  • A FireWire or Thunderbolt cable to connect the two Macs together.
  • A SCCM infrastructure with the Parallels SCCM Mac Management Proxy and Netboot server installed.
  • This is optional but I recommend it anyway: a copy of Xcode or another code editor to create your shell scripts in. I know you could just use TextEdit but I prefer something that has proper syntax highlighting and Xcode is at least free.
  • Patience. Lots of patience. You’ll need it. The process is time-consuming and can be infuriating when you get something wrong.

At the end of this process, you will have an OS X image which can be deployed to your Macs. The image will automatically name its target, it will download, install and configure the Parallels SCCM agent, join itself to your Active Directory domain, attach itself to a managed wireless network and it will install any additional software that’s not in your base image. The Mac will do this without any user interaction apart from initiating the build process.

Process Overview

The overview of the process is as follows:

  1. Create an OS X profile to join your Mac to your wireless network.
  2. Create a base installation of OS X with the required software and settings.
  3. Create a Automator workflow to deploy the Parallels agent and to do other minor configuration jobs.
  4. Use the System Image Utility to create the image and a workflow to automatically configure the disk layout and computer name.
  5. (Optional) Use the Mac OS X Netboot server to deploy the image to a Mac. This is to make sure that your workflow works and that you’ve got your post-install configuration scripts right before you add the image to your ConfigMgr server. You don’t have to do this but you may find it saves you a lot of time.
  6. Convert the image to a WIM file and add it to your SCCM OSD image library.
  7. Advertise the image to your Macs.

I’m going to assume that you already have your SCCM infrastructure, Parallels SCCM management proxy, Parallels Netboot server and OS X Server working.

Generate an OS X Profile.

Open a browser, go to the address of your Profile Manager (usually https://{hostname.domain}/profilemanager) and go to the Device Groups section. I prefer to generate a profile for each major setting that I’m pushing down. It makes for a little more work getting it set up but if one of your settings breaks something, it makes it easier to troubleshoot as you can remove a specific setting instead of the whole lot at once.

Your profile manager will look something like this:


As you can see, I’ve already set up some profiles but I will walk through the process for creating a profile to join your Mac to a wireless network. First of all, create a new device group by pressing the + button in the middle pane. You will be prompted to give the group a name, do so.


Go to the Settings tab and press the Edit button.


In the General section, change the download type to Manual and put a description in the description field. Under the Security section, change the profile removal option to “With Authorisation” and put a password in the box that appears. Type it in carefully; there is no confirm box.


If you are using a wireless network which requires certificates, scroll down to the certificates section and copy your certificates in there by dragging and dropping them. If you have an on-site CA, you may as well put the root trust certificate for that in there as well.


Go to the Networks section and put in the settings for your network.


When you’re done, press the OK button. You’ll go back to the main Profile Manager screen. Make sure you press the Save button.

I would strongly suggest that you explore Profile Manager and create profiles for other settings as well. For example, you could create one to control your Macs’ energy saving settings or to set up options for your users’ desktops.

When you’re back on the profile manager window, press the Download button and copy the resulting .mobileconfig file to a suitable network share.

Go to a PC with the SCCM console and the PMA plugin installed. Open the Assets and Compliance workspace. Go to Compliance Settings then Configuration Items. Optionally, if you haven’t already, create a folder for Mac profiles. Right click on your folder or on Configuration Items, go to Create Parallels Configuration Item then Mac OS X Configuration Profile from File.


Give the profile a name and description, change the profile type to System then press the Browse button and browse to the network share where you copied the .mobileconfig file. Double click on the mobileconfig file then press the OK button. You then need to go to the Baselines section and create a baseline with your configuration item in. Deploy the baseline to an appropriate collection.

Create an image

On the Mac which doesn’t have OS X Server installed, install your software. Create any additional local user accounts that you require. Make those little tweaks and changes that you inevitably have to make. If you want to make changes to the default user profile, follow the instructions on this very fine website to do so.

Once you’ve got your software installed and have got your profile set up the way you want it, you may want to boot your Mac into Target Mode and use your Server to create a snapshot using the System Image Utility or Disk Utility. This is optional but recommended as you will need to do a lot of testing which may end up being destructive if you make a mistake. Making an image now will at least allow you to roll back without having to start from scratch.

Creating an Automator workflow to perform post-image deployment tasks

Now here comes the messy bit. When you deploy your image to your Macs, you will undoubtedly want them to configure themselves without any user interaction. The only way that I have found to do this reliably is pretty awful but unfortunately I’ve found it to be necessary.

First of all, you need to enable the root account. The quickest way to do so is to is to open a terminal session and type in the following command:

dsenableroot -u {user with admin rights} -p {that user's password} -r {what you want the root password to be}

Log out and log in with the root user.

Go to System Preferences and go to Users and Groups. Change the Automatic Login option to System Administrator and type in the root password when prompted. When you’ve done that, go to the Security and Privacy section and go to General. Turn on the screensaver password option and set the time to Immediately. Check the “Show a Message…” box and set the lock message to something along the lines of “This Mac is being rebuilt, please be patient”. Close System Preferences for now.

You will need to copy the agent installation script from your PMA proxy server. It is located in your %Programfiles(x86)%\Parallels\PMA\files folder. Copy it to the Documents folder of your Root user.

Open your code editor (Xcode if you like, something else if you don’t) and enter the following script:


#Get computer's current name
CurrentComputerName=$(scutil --get ComputerName)

#Bring up a dialog box with the computer's name in it and give the user the option to change it. Time out after 60 secs
ComputerName=$(/usr/bin/osascript <<EOT
tell application "System Events"
set ComputerName to text returned of (display dialog "Please Input New Computer Name" default answer "$CurrentComputerName" with icon 2 giving up after 60)
end tell
EOT
)
ExitCode=$?

#Did the user press cancel? If so, exit the script
echo $ExitCode

if [ $ExitCode = 1 ]; then
exit 0
fi

#Compare the current names with the one entered, change if different

CurrentComputerName=$(scutil --get ComputerName)
CurrentLocalHostName=$(scutil --get LocalHostName)
CurrentHostName=$(scutil --get HostName)

echo "CurrentComputerName = $CurrentComputerName"
echo "CurrentLocalHostName = $CurrentLocalHostName"
echo "CurrentHostName = $CurrentHostName"

if [ "$ComputerName" = "$CurrentComputerName" ]; then
 echo "ComputerName Matches"
else
 echo "ComputerName Doesn't Match"
 scutil --set ComputerName "$ComputerName"
 echo "ComputerName Set"
fi

if [ "$ComputerName" = "$CurrentHostName" ]; then
 echo "HostName Matches"
else
 echo "HostName Doesn't Match"
 scutil --set HostName "$ComputerName"
 echo "HostName Set"
fi

if [ "$ComputerName" = "$CurrentLocalHostName" ]; then
 echo "LocalHostName Matches"
else
 echo "LocalHostName Doesn't Match"
 scutil --set LocalHostName "$ComputerName"
 echo "LocalHostName Set"
fi

#Invoke Screensaver to lock the desktop while the rest of the script runs
open -a /System/Library/Frameworks/ScreenSaver.framework/Versions/A/Resources/ScreenSaverEngine.app

#Join Domain
dsconfigad -add {FQDN.of.your.AD.domain} -user {User with join privs} -password {password for user} -force

#disable automatic login and remove the saved password for root
defaults delete /Library/Preferences/com.apple.loginwindow autoLoginUser
rm /etc/kcpassword

#install Configuration Manager client
chmod 755 /private/var/root/Documents/
/private/var/root/Documents/ http://FQDN.of.your.PMA.Server:8761/files/pma_agent.dmg {SCCM User} {Password for SCCM User} {FQDN.of.your.AD.Domain}
echo SCCM Client Installed

#Repair disk permissions
diskutil repairPermissions /
echo Disk Permissions Repaired

#Rename boot volume to match the computer name
diskutil rename "Macintosh HD" "$ComputerName"

#disable root
dsenableroot -d -u {User with admin rights on Mac} -p {That user's password}

#Reboot the Mac in 60 minutes, giving the agent time to pull down its initial policies
shutdown -r +60

Obviously you will need to change this to suit your environment.

As you can see, this has several parts. It calls a bit of AppleScript which prompts the user to enter the machine name. The default value is the Mac’s current hostname and the prompt times out after 60 seconds. The script gets the current hostname of the machine, compares it to what was entered in the box and changes the Mac’s name if it is different. It then invokes the Mac’s screensaver, joins the Mac to your AD domain, and downloads the PMA client from the PMA Proxy Server and installs it. It removes the automatic logon for the Root user, removes the saved password for Root, runs a Repair Permissions on the Mac’s hard disk, renames the Mac’s hard drive to match the Mac’s name, then disables the Root account and sets the Mac to reboot itself after 60 minutes. The Mac is given an hour before it reboots so that the PMA can download and apply its initial policies.

At this point, you will probably want to test the script to make sure that it works. This is why I suggested taking a snapshot of your Mac beforehand. Even if you do get it right, you still need to roll back your Mac to how it was before you ran the script.

Once the script has been tested, you will need to create an Automator workflow. Open the Automator app and create a new application. Go to the Utilities section and drag Shell Script to the pane on the right hand side.


At this point, you have a choice: You can either paste your entire script into there and have it all run as a big block of code or you can drag multiple shell script blocks across and break your code up into sections. I would recommend the latter approach; it makes viewing the progress of your script a lot easier and if you make a mistake in your script blocks, it makes it easier to track where the error is. When you’re finished, save the workflow application in the Documents folder. I have uploaded an anonymised version of my workflow: Login Script.

Finally, open System Preferences again and go to the Users and Groups section. Click on System Administrator and go to Login Items. Put the Automator workflow you created in as a login item. When the Mac logs in for the first time after its image is deployed, it will automatically run your workflow.

I’m sure you’re all thinking that I’m completely insane for suggesting that you do this but as I say, this is the only way I’ve found that reliably works. I tried using loginhooks and a login script set with a profile but those were infuriatingly unreliable. I considered editing the sudoers file to allow the workflow to work as Root without having to enter a password but I decided that was a long term security risk not worth taking. I have tried to minimise the risk of having Root log on automatically as much as possible; the desktop is only interactive for around 45-60 seconds before the screensaver kicks in and locks the machine out for those who don’t have the root password. Even for those who do have the root password, the Root account is only active for around 5-10 minutes until the workflow disables it after the Repair Disk Permissions command has finished.

Anyway, once that’s all done reboot the Mac into Target mode and connect it to your Mac running OS X Server.

Use the System Image Utility to create a Netboot image of your Mac with a workflow to deploy it.

There is a surprising lack of documentation on the Internet about the System Image Utility. I suppose that’s because it’s so bare-bones and most people use other solutions such as DeployStudio to deploy their Macs. I eventually managed to find some and this is what I’ve managed to cobble together.

On the Mac running OS X Server, open the Server utility and enter your username and password when prompted. When the OS X Server app finishes loading, go to the Tools menu and click on System Image Utility. This will open another app which will appear in your dock; if you see yourself using this app a lot, you can right click on it and tell it to stay in your dock.


Anyway, once the System Image Utility loads click on the Customize button. That will bring up a workflow window similar to Automator’s.


The default workflow has two actions in it: Define Image Source and Create Image. Just using these will create a working image but it will not have any kind of automation; the Mac won’t partition its hard drive or name itself automatically. To get this to work, you need to add a few more actions.

There will be a floating window with the possible actions for the System Image Utility open. Find the following three actions and add them to the workflow between the Define Image Source and Create Image actions. Make sure that you add them in the following order:

  1. Partition Disk
  2. Enable Automated Installation
  3. Apply System Configuration Settings

You can now configure the workflow actions themselves.

For the Define Image Source action, change the Source option to the Firewire/Thunderbolt target drive.

For the Partition Disk action, choose the “1 Partition” option and check the “Partition the first disk found” box and, optionally, the “Display confirmation dialog before partitioning” box. Checking the second box will give you a 30 second opportunity to create a custom partition scheme when you start the imaging process on your Mac clients. Choose a suitable name for the boot volume and make sure that the disk format is “Mac OS Extended (Journaled)”.

For the Enable Automated Installation action, put the name of the volume that you want the OS to be installed to into the box and check the “Erase before installing” box. Change the main language if you don’t want your Macs to install in English.

The Apply System Configuration Settings action is a little more complicated. This is the section which names your Macs. To do this, you need to provide a properly formatted text file with the Mac’s MAC address and its name. Each field is separated with a tab and there is no header line; there is a sample below. Save the file somewhere (I’d suggest in your user’s Documents folder) and put the full path to the file including the file name into the “Apply computer name…” box. There is an option in this action which is also supposed to join your Mac to a directory server but I could never get this to work no matter what I tried so leave that one alone.
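
For illustration, a couple of entries in that file would look like this (MAC address, then the computer name, with a single tab between the two fields; these addresses and names are made up):

00:16:cb:a6:12:34	MACLAB-01
00:16:cb:a6:43:21	MACLAB-02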

The last action is Create Image. Make sure that the Type is NetRestore and check the Include Recovery Partition box. You need to put something into the Installed Volume box but it doesn’t appear to matter what. Put a name for the image into the Image Name and Network Disk boxes and choose a destination to save the image to. I would suggest saving it directly to the /{volume}/Library/Netboot/NetbootSP0 folder as it will appear as a bootable image as soon as the image snapshot has been taken without you having to move or copy it to the correct location.

Once you’ve filled out the form, press the Save button to save your workflow then press Run. The System Image Utility will then generate your image ready for you to test. Do your best to make sure that you get all of this right; if you make any mistakes you will have to correct them and run the image creation workflow again, even if it is just a single setting or something in your script that’s wrong. The other problem with this is that if you add any new Macs to your estate you’ll have to update the text file with the Mac’s names and MAC addresses in and re-create the image again. This is why I put the “Name your Mac” section into the script.

Test the image

The next step is to test your Netboot image. To do so, connect your client Mac to the same network segment as your server. Boot it to the desktop and open System Preferences. Go to the Startup Disk pane and you should see the image that you just created as an option.


Click on it and press the Restart button. The Mac will boot into the installation environment and run through its workflow. When it’s finished, it will automatically log on as the Root user and run the login script that you created in a previous step.

Convert the image to a WIM and add it to your OSD Image Library

Once you’re happy that the image and the login script run correctly, you need to add your image to the ConfigMgr image library. Unfortunately, ConfigMgr doesn’t understand what an NBI is so we need to wrap it up into a WIM file.

To convert the image to a WIM file, first of all copy the NBI file to a suitable location on your PMA Proxy Server. Log onto the PMA Proxy using Remote Desktop and open the ConfigMgr client. Go to the Software Library workspace and Operating Systems then Operating System Images. Right click on Operating System Images and click on “Add Mac OS X Operating System Image”.


Click on the first browse button and go the location where you copied the NBI file to. This must be a local path, not a UNC.

Click on the second browse button and go to the share that you defined when you installed the Netboot agent on your PMA Proxy. This must be a UNC, not a local path. Press the Next button and wait patiently while the NBI image is wrapped up into a WIM file. When the process is finished, the image will be in your Operating System Images library. There is a minor bug here: If you click on a folder underneath the Image library, the image will still be added to the root of the library and not in the folder you selected. There’s nothing stopping you moving it afterwards but this did confuse me a little the first time I came across it. Once the image is added, you should copy it to a distribution point.

Advertise the image to your Macs

Nearly finished!

The final steps are to create a task sequence then deploy the task sequence to a collection. To create the task sequence, open the ConfigMgr console on a PC which has the Parallels console extension installed. Go to the Software Library workspace and Operating Systems. Under there, go to Task Sequences and right click on Task Sequences. Select “Create Task Sequence for Macs” and this will appear:


Put in a name for the task sequence then press the Browse button. After a small delay, a list of the available OS X images will appear. Choose the one that you want and press the Finish button. The task sequence will then appear in your sequence library but like with the images, it will appear in the root rather than in a specific folder. The only step left is to deploy the task sequence to a collection; the process for this is identical to the one for Windows PCs. I don’t know if it’s necessary but I always deploy the sequence to the Unknown Computers collection as well as the collections that the Macs sit in, just to be sure that new Macs get it as well.

Assuming that you have set up the Netboot server on the PMA Proxy properly, all of the Macs which are in the collection(s) you advertised the image to will have your image as a boot option. Good luck and have fun!


Despite me spending literally weeks writing this almost 4,000-word blog post when I had the time and inclination to do so, it is worth mentioning again that all of this is going to be obsolete very soon. The next version of the Parallels agent is going to have proper task sequencing in it. My contact within Parallels tells me that they are mimicking Microsoft’s task sequence UI so that you can deploy software and settings during the build process and that there will be a task sequence wizard on the Mac side which will allow you to select a task sequence to run. I’m guessing (hoping!) that will be in the existing Parallels Application Portal where you can install optional applications from.

Kindness of Strangers

So, I was out cycling this evening. I decided to take my bike up to Someries Castle because I’ve driven past the brown sign pointing at it on my way to work every day for the last two years and I was curious to see what, exactly, was there. The answer is not very much but I digress. The castle is on some land next to a farm and the track that approaches it is very rough. I picked up a puncture there. It was a big one and I couldn’t get enough air into the tyre with my hand pump to get myself home. Of course I stupidly didn’t have any spare tubes or a puncture repair kit on me so I faced a five mile walk on Cycle Route 6 to get home.

Just under two miles into my walk, another cyclist passed me. He asked me if I was OK and I asked him if he had a puncture repair kit on him. He said no, but that his house was just around the corner, that he had one there and that I was welcome to repair my bike at his home. I got to his garage and he offered me a spare tube, refusing payment for it. He lent me some tyre levers and a pump and we had a brief chat about the area, the cycle track between Luton and Harpenden and how we use our bikes.

I thanked him profusely when I finished fixing my bike but I’d like to do so again publicly so to the very nice man who helped me when I needed it: THANK YOU.

The lessons that I’m going to take from this are as follows:

  1. Carry a puncture repair kit or spare tubes with you. Some CO2 tubes are a good idea too. Walking miles home pushing a bike is no fun.
  2. Help people who need it. I intend to carry this man’s kindness forwards; if I ever come across a fellow cyclist in distress I will help them in the same way he helped me.

I’m not going to be trite and say that this restored my faith in humanity or something cheesy like that but it was good to see that there are some decent people out there who will help you for the sake of helping you.

Me – A Progress Report

It’s been a month or thereabouts since I made the post about trying to get healthier. This is how it’s gone so far:

So, let’s start with the bits where I’ve been reasonably good. In the space of a month, I have biked about 50 miles, been swimming once and swam about 600m, and I have been taking walks with my colleagues at lunchtime, covering somewhere between 15 and 20 miles during those times. Over the Easter break, my girlfriend and I went on holiday to Holland (yes, actual Holland, not just the Netherlands) and we must have walked three laps around the centre of Amsterdam. I’ve done a fair amount of exercise.

On the downside, I wanted to do considerably more exercise in that time (50 miles on a bike and 600m of swimming in a month is pretty pathetic really) but a bout of illness and my time on holiday put a crimp in those plans.

I need to work harder on my diet. I have improved my breakfasts a little and my evening meals are not excessive but it’s during the working day that I need to be better. I don’t usually give myself enough time in the mornings to prepare a decent lunch, so that inevitably means that I need to buy lunch at work. I work for a sixth form college and as much as it distresses the catering staff there, the meals that they serve are, shall we say, suboptimal. Please don’t interpret that as an attack on our catering staff; they are hard working and very skilled. The trouble is that when the catering staff prepare healthy meals for our students, they don’t buy them. I suppose it’s better to serve them chips and burgers and at least get them fed than it is to waste money on healthy food which doesn’t get sold and leave them hungry. This means that my choices when I’m buying food at work are rather limited. I’ve also learned through experience that I can’t eat in the cafeteria because when I try, I end up getting interrupted by a member of staff who thinks that their IT problems are more important than my lunch break and I end up abandoning it. They have a food shop on campus. They sell sandwiches but they’re generally ones which I don’t like; they either have fillings which are not to my taste or they’re lathered with mayonnaise, which I loathe. The shop is also loaded with confectionery and sugary drinks. There is a fruit basket at the till but there is generally only about a 50% chance that I’ll find something in there which looks edible. Pretty much the only place where I can buy filling food at college which I can eat at my desk or away from the servery is the cafe which sells panini, sausage rolls, pastries and cakes. Ultimately, it doesn’t make for a very good lunch.

My diet while I was on holiday wasn’t very good either; we ended up having chips or other junk food most days and we even stopped at a McDonald’s at a motorway service station because we were desperate and there was nothing else. I tell you, I wasn’t expecting much from it but amazingly they still ended up disappointing me. That burger was vile. It’s a mystery to me how McDonald’s are so popular.

Anyway, over the last month I’ve lost about half a kilogram. It’s progress but not as much as I wanted. I want to lose at least another 15KG (about 33lb, or 2 stone 5lb) and 4″ from my waistline and I am determined to do it. Again. I just hope it won’t take me 30 months to do!


Over the last few years, I have had… issues… with my weight. I have never been morbidly obese but I have been bigger and heavier than I’d like to be. A few years ago, I lost a substantial amount of weight for reasons that I eventually put down to stress; I was in a job that I disliked intensely and pretty unhappy on a personal level too. I did come out of the other side of it and had the unexpected benefit of losing about eight inches from my waistline and about 20-25KG of weight.

Since then, I’ve gained, lost and gained weight again. I’m nowhere near as big as I was at my heaviest but I’m still on the wrong side of 100KG and some of my clothes are starting to get uncomfortable. I need to do something about this. So, I am going to start tracking what I’m eating. I’m going to cut the crap out of my diet and start riding my bike on a regular basis. I might even start making my eight-ish mile journey to work on my bike instead of driving it. 16 miles a day? A tough order at the moment but if I can get my fitness up, I’ll see the benefits.

Anyway, I’m going to start posting my progress on here in the vague hope that making it public will spur me on and keep me on the straight and narrow. Wish me luck!

The Grand(ish) Experiment – The latest

Well, I’m no longer using the Venue as my daily driver. I liked the tablet and it proved itself perfectly capable of handling my workload. However, it’s now time for some of the intended recipients of these things to try them out and see how they get on with them. To that end, I have given the four tablets to the occupants of one of our teaching rooms and we intend to set up a docking station in the room and connect all of their equipment to it. The four teachers have varying levels of computer confidence, ranging from high to low, so hopefully we should get a fair idea of how viable this little project will be.

We have however hit a snag. I actually tried to put the docking station in the room yesterday and connect it to a VGA projector. Unfortunately the docked tablet didn’t detect the projector when the dock was plugged into the projector via the fixed VGA cable and the DisplayPort to VGA adaptor. They worked together happily enough when they were connected with a short 2m VGA cable so I don’t think it’s a question of compatibility as such. I think it’s down to either a faulty fixed VGA cable, the DisplayPort to VGA adaptor not outputting a powerful enough signal for the projector to pick up, or the known problems with the A00 revision dock which I talked about before. We have a few more docks on order which will hopefully be the new revision, along with a Dell-sanctioned DisplayPort to VGA adaptor. When they arrive, we will give them a go and see if they’re any better. Hopefully we’ll be able to work around these little problems and get a setup working in a classroom for our teachers to experiment with.

On a related tangent, when we ordered our Venue 11 Pros we also ordered a couple of Venue 8 Pros as well with a view of seeing how they behaved for students. I borrowed one last week to go on a training course with. Microsoft no longer seem to be giving away paper literature with their courses; they are using electronic books instead. I hoped that they’d issue the books in PDF format, which would have let me import them into OneNote and make notes on them. Sadly Microsoft use a proprietary courseware reader from a company called SkillPipe which uses its own encrypted file format, so I couldn’t do exactly what I wanted.

However, it did give me some time to get more closely acquainted with this tablet. Despite not being able to scribble notes onto the book, it still acted as a pretty good courseware reader with their Windows 8 app and a good ebook reader for when I was on the train. The tablet is a really nice size and weight and it feels well balanced in the hand when used in portrait mode. The performance of the tablet won’t set the world on fire but it has enough grunt to run the Office suite in its entirety, it was quite happy running the Modern Mode apps installed on it and surprisingly, it even made a decent fist of running Photoshop CC 2014. It was quite fun using it to scribble. To my considerable surprise, after cursing and swearing at the Windows 8 interface on my work desktop for so long, I actually started to enjoy using it on the tablet. The swipes, the charms bar, the multitasking panes and the task switching interface all made sense when you poke the screen rather than use a keyboard and mouse. I think that if Microsoft had taken a similar route to Apple and decided to have separate OSes for desktops and tablets, the market would be looking very different right now. But I digress.

My only complaints would be the sad lack of apps in the Windows Store and the relatively low resolution screen. Google don’t make any official apps for Windows so there is no official YouTube player, no Google Movies or Music, no Maps. Yes, there are the Microsoft equivalents but I didn’t think much of those. The availability of first-party apps from other providers was pretty slim too; no Instagram, no Feedly, no third-party browsers which use the Modern interface. Even where there were apps, they seemed functionally poor compared to their iOS and Android cousins; I tried the Windows version of Tapatalk and it was just awful. There was, however, a decent Kindle app which I took full advantage of. Granted, a lot of these things could be accessed through the browser but I found that most websites were rendered in Desktop mode and were a bit too small to be usable with your fingers.

My other complaint was the screen. It is a 1280×800 IPS screen. The colour was good, viewing angles and brightness were excellent but after getting used to the Retina display on my iPhone and the 1080p display on the Venue 11s, the 8″ display just looked crap. A 1440×900 or 1080p screen would have been a massive improvement but I guess the GPU in the Atom Z3740D CPU isn’t powerful enough to drive a display that size.

Anyway, despite all that I came away feeling pretty impressed with the Dell Venue 8 Pro and if I had a spare couple of hundred quid to spend it’d be on my list to consider.

DCM Scripts – Checking Windows Activation Status

My last script was, out of necessity, a rather laborious one: using a VBScript to check a status, generating a file from its output, reading files, creating objects and properties and so on. Luckily, checking Windows activation (and Office activation on Windows 8.1) is considerably easier.

There is a WMI class called SoftwareLicensingProduct which is where Windows happens to store the activation status for itself and, on Windows 8, for Office as well.

To detect the activation status for Windows itself, use this:

$WindowsActivationStatus = Get-CimInstance SoftwareLicensingProduct -Filter "Description LIKE '%KMSCLIENT%' and Name LIKE '%Windows%'" | select ID, Description, LicenseStatus, Name, GenuineStatus

if ($WindowsActivationStatus.LicenseStatus -eq "1") {
 echo "Windows is Activated"
} else {
 echo "Windows is not activated"
}

And use this as a remediation script:

c:\windows\system32\cscript.exe c:\windows\system32\slmgr.vbs /skms {your.kms.server.fqdn}
c:\windows\system32\cscript.exe c:\windows\system32\slmgr.vbs /ato

Strictly speaking, the first line shouldn’t be necessary if you’ve set KMS up properly but including it does at least force the machine to look at the correct server for activation.
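
If you want to check that auto-discovery is actually working, the KMS host registers an SRV record in DNS which clients query to find it. A quick sketch, assuming your AD DNS domain is contoso.com:

Resolve-DnsName -Name _vlmcs._tcp.contoso.com -Type SRV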

Set up a compliance rule to look for a string which says “Windows is Activated”, create a new baseline or add it to an existing one and deploy it to a collection.
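
The Office side works the same way on Windows 8.1. A minimal sketch of an equivalent detection script, assuming a KMS-licensed copy of Office 2013; the Name filter is the only real change:

$OfficeActivationStatus = Get-CimInstance SoftwareLicensingProduct -Filter "Description LIKE '%KMSCLIENT%' and Name LIKE '%Office%'" | select Name, LicenseStatus

if ($OfficeActivationStatus.LicenseStatus -eq "1") {
 echo "Office is Activated"
} else {
 echo "Office is not activated"
}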


