
SCOM – SQL Management Pack Configuration

SCOM is a bastard of a product, isn’t it? It’s even more so when you’re trying to monitor a SQL instance or two. It’s also quite amusing that Chrome doesn’t recognise SCOM as a word in its dictionary and that it suggests SCAM as a possible word 🙂

My major project at work for the past few months has been SCOM. I am monitoring about 300 Windows VMs, about a third of which have SQL database instances on them. I’ve kept with using the LocalSystem account as the SCOM action account and for the majority of the time, that’s enough. However, there have been a few times where it hasn’t been enough. It’s always a permissions issue: the LocalSystem account doesn’t have access to one or more of the databases, so the discovery and monitoring scripts can’t run and you get a myriad of alerts.

When it comes to adding a management pack into SCOM, always read the damn documentation that comes with the MP. I know it’s tedious but it’s necessary. Reading the documentation for the SQL Management Pack found at Microsoft’s website gives you some interesting recommendations. They suggest that you have three action accounts for SQL:

  1. A discovery account
  2. A default action account
  3. A monitoring account

They also recommend that you put the monitoring and discovery accounts into an additional AD group. Once you do that, you have to add the users to SQL, assign them specific permissions to databases, give them access to parts of the Windows registry, assign them permissions to various WMI namespaces, grant them local logon privileges and more. I’m not going to go over the whole process; if you really want to see it, look at Microsoft’s documentation.

The point is, it’s a lot of work. Wouldn’t it be nice if we could automate it? Well, I’ve written a script that does precisely that. It’s a big one:


Function Set-WmiNamespaceSecurity
{
[cmdletbinding()]
Param ( [parameter(Mandatory=$true,Position=0)][string] $namespace,
[parameter(Mandatory=$true,Position=1)][string] $operation,
[parameter(Mandatory=$true,Position=2)][string] $account,
[parameter(Position=3)][string[]] $permissions = $null,
[bool] $allowInherit = $false,
[bool] $deny = $false,
[string] $computerName = ".",
[System.Management.Automation.PSCredential] $credential = $null)

Process {
#$ErrorActionPreference = "Stop"

Function Get-AccessMaskFromPermission($permissions) {
$WBEM_ENABLE = 1
$WBEM_METHOD_EXECUTE = 2
$WBEM_FULL_WRITE_REP = 4
$WBEM_PARTIAL_WRITE_REP = 8
$WBEM_WRITE_PROVIDER = 0x10
$WBEM_REMOTE_ACCESS = 0x20
$WBEM_RIGHT_SUBSCRIBE = 0x40
$WBEM_RIGHT_PUBLISH = 0x80
$READ_CONTROL = 0x20000
$WRITE_DAC = 0x40000

$WBEM_RIGHTS_FLAGS = $WBEM_ENABLE,$WBEM_METHOD_EXECUTE,$WBEM_FULL_WRITE_REP,`
$WBEM_PARTIAL_WRITE_REP,$WBEM_WRITE_PROVIDER,$WBEM_REMOTE_ACCESS,`
$READ_CONTROL,$WRITE_DAC
$WBEM_RIGHTS_STRINGS = "Enable","MethodExecute","FullWrite","PartialWrite",`
"ProviderWrite","RemoteAccess","ReadSecurity","WriteSecurity"

$permissionTable = @{}

for ($i = 0; $i -lt $WBEM_RIGHTS_FLAGS.Length; $i++) {
$permissionTable.Add($WBEM_RIGHTS_STRINGS[$i].ToLower(), $WBEM_RIGHTS_FLAGS[$i])
}

$accessMask = 0

foreach ($permission in $permissions) {
if (-not $permissionTable.ContainsKey($permission.ToLower())) {
throw "Unknown permission: $permission`nValid permissions: $($permissionTable.Keys)"
}
$accessMask += $permissionTable[$permission.ToLower()]
}

$accessMask
}

if ($PSBoundParameters.ContainsKey("Credential")) {
$remoteparams = @{ComputerName=$computerName;Credential=$credential}
} else {
$remoteparams = @{ComputerName=$computerName}
}

$invokeparams = @{Namespace=$namespace;Path="__systemsecurity=@"} + $remoteParams

$output = Invoke-WmiMethod @invokeparams -Name GetSecurityDescriptor
if ($output.ReturnValue -ne 0) {
throw "GetSecurityDescriptor failed: $($output.ReturnValue)"
}

$acl = $output.Descriptor
$OBJECT_INHERIT_ACE_FLAG = 0x1
$CONTAINER_INHERIT_ACE_FLAG = 0x2

$computerName = (Get-WmiObject @remoteparams Win32_ComputerSystem).Name

if ($account.Contains('\')) {
$domainaccount = $account.Split('\')
$domain = $domainaccount[0]
if (($domain -eq ".") -or ($domain -eq "BUILTIN")) {
$domain = $computerName
}
$accountname = $domainaccount[1]
} elseif ($account.Contains('@')) {
$domainaccount = $account.Split('@')
$domain = $domainaccount[1].Split('.')[0]
$accountname = $domainaccount[0]
} else {
$domain = $computerName
$accountname = $account
}

$getparams = @{Class="Win32_Account";Filter="Domain='$domain' and Name='$accountname'"}

$win32account = Get-WmiObject @getparams

if ($win32account -eq $null) {
throw "Account was not found: $account"
}

switch ($operation) {
"add" {
if ($permissions -eq $null) {
throw "-Permissions must be specified for an add operation"
}
$accessMask = Get-AccessMaskFromPermission($permissions)

$ace = (New-Object System.Management.ManagementClass("win32_Ace")).CreateInstance()
$ace.AccessMask = $accessMask
if ($allowInherit) {
$ace.AceFlags = $OBJECT_INHERIT_ACE_FLAG + $CONTAINER_INHERIT_ACE_FLAG
} else {
$ace.AceFlags = 0
}

$trustee = (New-Object System.Management.ManagementClass("win32_Trustee")).CreateInstance()
$trustee.SidString = $win32account.Sid
$ace.Trustee = $trustee

$ACCESS_ALLOWED_ACE_TYPE = 0x0
$ACCESS_DENIED_ACE_TYPE = 0x1

if ($deny) {
$ace.AceType = $ACCESS_DENIED_ACE_TYPE
} else {
$ace.AceType = $ACCESS_ALLOWED_ACE_TYPE
}

$acl.DACL += $ace.psobject.immediateBaseObject
}

"delete" {
if ($permissions -ne $null) {
throw "Permissions cannot be specified for a delete operation"
}

[System.Management.ManagementBaseObject[]]$newDACL = @()
foreach ($ace in $acl.DACL) {
if ($ace.Trustee.SidString -ne $win32account.Sid) {
$newDACL += $ace.psobject.immediateBaseObject
}
}

$acl.DACL = $newDACL.psobject.immediateBaseObject
}

default {
throw "Unknown operation: $operation`nAllowed operations: add delete"
}
}

$setparams = @{Name="SetSecurityDescriptor";ArgumentList=$acl.psobject.immediateBaseObject} + $invokeParams

$output = Invoke-WmiMethod @setparams
if ($output.ReturnValue -ne 0) {
throw "SetSecurityDescriptor failed: $($output.ReturnValue)"
}
}
}
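The trustee lookup in the function above accepts an account as DOMAIN\user, user@domain or a bare name. A minimal shell sketch of that branching logic, with a hypothetical parse_account helper and made-up account names (an illustration only, not part of the script):

```shell
#!/bin/sh
# Sketch of the account-format branching in Set-WmiNamespaceSecurity.
# parse_account is a hypothetical helper; it echoes "domain accountname".
parse_account() {
    acct=$1
    case $acct in
        *\\*) domain=${acct%%\\*}; name=${acct#*\\} ;;   # DOMAIN\user
        *@*)  name=${acct%%@*}                           # user@domain.tld
              domain=${acct#*@}; domain=${domain%%.*} ;; # keep the short domain name
        *)    domain=$(hostname); name=$acct ;;          # bare name: assume local machine
    esac
    echo "$domain $name"
}
parse_account 'INTRANET\om_aa_sql_da'               # -> INTRANET om_aa_sql_da
parse_account 'om_aa_sql_da@intranet.example.com'   # -> intranet om_aa_sql_da
```

The PowerShell original additionally maps the "." and "BUILTIN" domains to the local computer name before resolving the account to a SID via Win32_Account.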

Function Add-DomainUserToLocalGroup
{
[cmdletBinding()]
Param(
[Parameter(Mandatory=$True)]
[string]$computer,
[Parameter(Mandatory=$True)]
[string]$group,
[Parameter(Mandatory=$True)]
[string]$domain,
[Parameter(Mandatory=$True)]
[string]$user
)
$de = [ADSI]"WinNT://$computer/$Group,group"
$de.psbase.Invoke("Add",([ADSI]"WinNT://$domain/$user").path)
} #end function Add-DomainUserToLocalGroup

Function Add-UserToLocalLogon
{
[cmdletBinding()]
Param(
[Parameter(Mandatory=$True)]
[string]$UserSID
)
$tmp = [System.IO.Path]::GetTempFileName()
secedit.exe /export /cfg "$($tmp)"
$c = Get-Content -Path $tmp
$currentSetting = ""

foreach($s in $c) {
if( $s -like "SeInteractiveLogonRight*") {
$x = $s.split("=",[System.StringSplitOptions]::RemoveEmptyEntries)
$currentSetting = $x[1].Trim()
}
}

if( $currentSetting -notlike "*$($UserSID)*" ) {
if( [string]::IsNullOrEmpty($currentSetting) ) {
$currentSetting = "*$($UserSID)"
} else {
$currentSetting = "*$($UserSID),$($currentSetting)"
}

$outfile = @"
[Unicode]
Unicode=yes
[Version]
signature="`$CHICAGO`$"
Revision=1
[Privilege Rights]
SeInteractiveLogonRight = $($currentSetting)
"@

$tmp2 = [System.IO.Path]::GetTempFileName()

$outfile | Set-Content -Path $tmp2 -Encoding Unicode -Force

Push-Location (Split-Path $tmp2)

try {
secedit.exe /configure /db "secedit.sdb" /cfg "$($tmp2)" /areas USER_RIGHTS

} finally {
Pop-Location
}
}
}
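Add-UserToLocalLogon works by exporting the local security policy, splicing the new SID into the SeInteractiveLogonRight line and re-importing it. The splicing rule (do nothing if the SID is already present, otherwise prepend it) can be sketched in shell; the policy line and SIDs below are made up for illustration:

```shell
#!/bin/sh
# Sketch of the SeInteractiveLogonRight merge in Add-UserToLocalLogon.
# Sample exported line and SID are illustrative only.
line='SeInteractiveLogonRight = *S-1-5-32-544,*S-1-5-32-545'
sid='S-1-5-21-949506055-860247811-1542849698-1419242'

current=${line#*= }                       # everything after "= "
case $current in
    *"$sid"*) ;;                          # SID already granted: leave alone
    "")       current="*$sid" ;;          # right currently empty
    *)        current="*$sid,$current" ;; # prepend, keep existing entries
esac
echo "SeInteractiveLogonRight = $current"
```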

#Set Global Variables

$Default_Action_Account = "om_aa_sql_da"
$Discovery_Action_Account = "om_aa_sql_disc"
$Monitoring_Action_Account = "om_aa_sql_mon"
$LowPrivGroup = "SQLMPLowPriv"

$WindowsDomain = "Intranet"
#Add users to local groups

Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Performance Monitor Users" -user $Monitoring_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Performance Monitor Users" -user $Default_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Event Log Readers" -user $Monitoring_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Event Log Readers" -user $Default_Action_Account -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Users" -user $LowPrivGroup -domain $WindowsDomain
Add-DomainUserToLocalGroup -computer $env:COMPUTERNAME -group "Users" -user $Default_Action_Account -domain $WindowsDomain
<#
#AD SIDs for Default Action Account user and Low Priv group - required for adding users to local groups and for service security settings.

#Define SIDs for Default Action and Low Priv group. To get a SID, use the following command:
#Get-ADUser -identity [user] | select SID
#and
#Get-ADGroup -identity [group] | select SID
#Those commands are part of the AD management pack which is why they're not in this script, I can't assume that this script is being run on a DC or on
#a machine with the AD management shell installed
#>

$SQLDASID = "S-1-5-21-949506055-860247811-1542849698-1419242"
$SQLMPLowPrivsid = "S-1-5-21-949506055-860247811-1542849698-1419239"

Add-UserToLocalLogon -UserSID $SQLDASID
Add-UserToLocalLogon -UserSID $SQLMPLowPrivsid

#Set WMI Namespace Security

Set-WmiNamespaceSecurity root add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\cimv2 add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\default add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement10'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement10 add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity }
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement11'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement11 add $WindowsDomain\$Default_Action_Account MethodExecute,Enable,RemoteAccess,Readsecurity }

Set-WmiNamespaceSecurity root add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\cimv2 add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity
Set-WmiNamespaceSecurity root\default add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement10'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement10 add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity }
if (Get-WMIObject -class __Namespace -namespace root\microsoft\sqlserver -filter "name='ComputerManagement11'") {
Set-WmiNamespaceSecurity root\Microsoft\SqlServer\ComputerManagement11 add $WindowsDomain\$LowPrivGroup MethodExecute,Enable,RemoteAccess,Readsecurity }
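Each of those calls grants the same four rights, so every ACE written carries the same access mask. Using the flag values from Get-AccessMaskFromPermission, the arithmetic works out like this (a quick shell sanity check, not part of the script):

```shell
#!/bin/sh
# Enable=0x1, MethodExecute=0x2, RemoteAccess=0x20, ReadSecurity=0x20000
# (flag values as defined in Get-AccessMaskFromPermission)
mask=$(( 0x1 + 0x2 + 0x20 + 0x20000 ))
printf 'access mask = 0x%X (%d)\n' "$mask" "$mask"   # -> 0x20023 (131107)
```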

#Set Registry Permissions

$acl = Get-Acl 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($Default_Action_Account)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$acl = $null
$acl = Get-Acl 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($LowPrivGroup)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server'
$acl = $null

$SQLInstances = Get-ChildItem 'registry::hklm\SOFTWARE\Microsoft\Microsoft SQL Server' | ForEach-Object {Get-ItemProperty $_.pspath } | Where-Object {$_.pspath -like "*MSSQL1*" }

$SQLInstances | Foreach {
$acl = Get-Acl "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($LowPrivGroup)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$acl = $null

$acl = Get-Acl "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$Rule = New-Object System.Security.AccessControl.RegistryAccessRule ("$($WindowsDomain)\$($Default_Action_Account)","readkey","ContainerInherit","None","Allow")
$acl.SetAccessRule($Rule)
$acl | Set-Acl -Path "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$($_.PSChildName)\MSSQLSERVER\Parameters"
$acl = $null

}

#Set SQL Permissions

#Get SQL Version
#If there is only one instance, $SQLInstances is a single object rather than an array
if ($SQLInstances -is [array]) {

$version = Get-ItemProperty "registry::HKLM\Software\Microsoft\Microsoft SQL Server\$($SQLInstances[0].PSChildName)\MSSQLSERVER\CurrentVersion"

} else {

$version = Get-ItemProperty "registry::HKLM\Software\Microsoft\Microsoft SQL Server\$($SQLInstances.PSChildName)\MSSQLSERVER\CurrentVersion"

}
#Import appropriate SQL PowerShell module

if ([version]$version.CurrentVersion -ge [version]"11.0") {
#Import SQL 2012 Module
Import-Module sqlps
#change out of sql context
c:
} else {
#Add SQL 2008 Snap-in
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
}
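The branch keys off the major version number: SQL 2012 reports 11.x while 2008/2008 R2 report 10.x. Be careful here: CurrentVersion is a string, and a lexical comparison would sort "9.0" after "11.0". A shell illustration of comparing the major part numerically (the version strings are examples and pick_module is hypothetical):

```shell
#!/bin/sh
# Compare only the numeric major part of a SQL version string.
pick_module() {
    major=${1%%.*}                        # "11.0.3000.0" -> "11"
    if [ "$major" -ge 11 ]; then echo sqlps; else echo snapin; fi
}
pick_module '11.0.3000.0'    # SQL 2012    -> sqlps
pick_module '10.50.2500.0'   # SQL 2008 R2 -> snapin
pick_module '9.0.5000'       # SQL 2005    -> snapin (a lexical compare would get this wrong)
```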

#Create database users and assign permissions

$CreateDatabaseUsers = "use master
go

create login [$($WindowsDomain)\$($LowPrivGroup)] from windows
go

grant view server state to [$($WindowsDomain)\$($LowPrivGroup)]
grant view any definition to [$($WindowsDomain)\$($LowPrivGroup)]
grant view any database to [$($WindowsDomain)\$($LowPrivGroup)]
grant select on sys.database_mirroring_witnesses to [$($WindowsDomain)\$($LowPrivGroup)]
go

create login [$($WindowsDomain)\$($Default_Action_Account)] from windows
go

grant view server state to [$($WindowsDomain)\$($Default_Action_Account)]
grant view any definition to [$($WindowsDomain)\$($Default_Action_Account)]
grant view any database to [$($WindowsDomain)\$($Default_Action_Account)]
grant alter any database to [$($WindowsDomain)\$($Default_Action_Account)]
grant select on sys.database_mirroring_witnesses to [$($WindowsDomain)\$($Default_Action_Account)]
go"

#Generate query to assign users and permissions to databases
$DatabaseUsers1 = "SELECT 'use ' + name + ' ;'
+ char(13) + char(10)
+ 'create user [$($WindowsDomain)\$($LowPrivGroup)] FROM login [$($WindowsDomain)\$($LowPrivGroup)];'
+ char(13) + char(10) + 'go' + char(13) + char(10)
FROM sys.databases WHERE database_id = 1 OR database_id >= 3
UNION
SELECT 'use msdb; exec sp_addrolemember @rolename=''SQLAgentReaderRole'', @membername=''$($WindowsDomain)\$($LowPrivGroup)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)
UNION
SELECT 'use msdb; exec sp_addrolemember @rolename=''PolicyAdministratorRole'', @membername=''$($WindowsDomain)\$($LowPrivGroup)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)
"

$DatabaseUsers2 = "SELECT 'use ' + name + ' ;'
+ char(13) + char(10)
+ 'create user [$($WindowsDomain)\$($Default_Action_Account)] FROM login [$($WindowsDomain)\$($Default_Action_Account)];'
+ 'exec sp_addrolemember @rolename=''db_owner'', @membername=''$($WindowsDomain)\$($Default_Action_Account)'';'
+ 'grant alter to [$($WindowsDomain)\$($Default_Action_Account)];'
+ char(13) + char(10) + 'go' + char(13) + char(10)
FROM sys.databases WHERE database_id = 1 OR database_id >= 3
UNION
SELECT 'use msdb; exec sp_addrolemember @rolename=''SQLAgentReaderRole'', @membername=''$($WindowsDomain)\$($Default_Action_Account)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)
UNION
SELECT 'use msdb; exec sp_addrolemember @rolename=''PolicyAdministratorRole'', @membername=''$($WindowsDomain)\$($Default_Action_Account)'''
+ char(13) + char(10) + 'go' + char(13) + char(10)
"

#
$SQLInstances | Foreach {
if ($_.PSChildName.split('.')[-1] -eq "MSSQLSERVER") {
$InstanceName = $env:COMPUTERNAME
} else {
$InstanceName = "$($env:COMPUTERNAME)\$($_.PSChildName.split('.')[-1])" }

Invoke-Sqlcmd -ServerInstance $InstanceName -Query $CreateDatabaseUsers
$Provision1 = Invoke-Sqlcmd -ServerInstance $InstanceName -Query $DatabaseUsers1
$Provision2 = Invoke-Sqlcmd -ServerInstance $InstanceName -Query $DatabaseUsers2

$Provision1 | foreach {
Invoke-Sqlcmd -ServerInstance $InstanceName -Query $_.ItemArray[0]
}
$Provision2 | foreach {
Invoke-Sqlcmd -ServerInstance $InstanceName -Query $_.ItemArray[0]
}
}

#Grant Default Action account rights to start and stop SQL Services

$SQLServices = Get-Service -DisplayName "*SQL*"

$SQLServices | Foreach {
& c:\windows\system32\sc.exe sdset $_.Name D`:`(A`;`;GRRPWP`;`;`;$($SQLDASID)`)`(A`;`;CCLCSWRPWPDTLOCRRC`;`;`;SY`)`(A`;`;CCDCLCSWRPWPDTLOCRSDRCWDWO`;`;`;BA`)`(A`;`;CCLCSWLOCRRC`;`;`;IU`)`(A`;`;CCLCSWLOCRRC`;`;`;SU`)`S`:`(AU`;FA`;CCDCLCSWRPWPDTLOCRSDRCWDWO`;`;`;WD`)
}
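All that backtick escaping hides a plain SDDL string. The first ACE is the one being added for the Default Action account; as I read the service-rights mapping (treat this as a guide, not gospel), GR is generic read, RP is start and WP is stop:

```shell
#!/bin/sh
# The action-account ACE from the sdset string, without the backtick escaping.
#   A  = allow ACE
#   GR = GENERIC_READ (query the service's configuration and status)
#   RP = SERVICE_START
#   WP = SERVICE_STOP
sid='S-1-5-21-949506055-860247811-1542849698-1419242'
printf '(A;;GRRPWP;;;%s)\n' "$sid"
```

The remaining ACEs in the sdset call appear to reproduce the default service security descriptor for SYSTEM, Administrators and interactive/service users.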

There are huge swathes of this script that I cannot take credit for, mostly the functions.

The Set-WmiNamespaceSecurity function was pilfered directly from here: https://live.paloaltonetworks.com/t5/Management-Articles/PowerShell-Script-for-setting-WMI-Permissions-for-User-ID/ta-p/53646. I got it from Palo Alto’s website but it appears to have been written by Microsoft themselves.

The Add-DomainUserToLocalGroup function was stolen from the Hey, Scripting Guy! Blog, found here: https://blogs.technet.microsoft.com/heyscriptingguy/2010/08/19/use-powershell-to-add-domain-users-to-a-local-group/

The Add-UserToLocalLogon function was lifted wholesale from here: https://ikarstein.wordpress.com/2012/10/12/powershell-script-to-add-account-to-allow-logon-locally-privilege-on-local-security-policy/

The rest, however, is all mine, which you can probably tell from the quality of the code. You will need to change some variables from line 223 to match your environment. That said, it works and that’s all I care about. Enjoy!

Sigh, sometimes, software can be too bloody clever for its own good. The Code Block module that I’m using isn’t doing a very good job of formatting this script and it’s replaced some characters such as &, < and > with their HTML equivalents. I think I’ve weeded them all out but I may not have. If not, let me know.

Building, Deploying and Automatically Configuring a Mac Image using SCCM and Parallels SCCM Agent

I touched briefly on using the Parallels Management Agent to build Macs in my overview article but I thought it might be a good idea to go through the entire process that I use when I have to create an image for a Mac, getting the image deployed and getting the Mac configured once the image is on there. At the moment, it’s not a simple process. It requires the use of several tools and, if you want the process to be completely automated, some Bash scripting as well. The process isn’t as smooth as you would get from solutions like DeployStudio but it works and, in my opinion anyway, it works well enough for you not to have to bother with a separate product for OSD. Parallels are working hard on this part of the product and they tell me that proper task sequencing will be part of v4 of the agent. As much as I’m looking forward to that, it doesn’t change the fact that right now we’re on v3.5 and we have to use the messy process!

First of all, I should say that this is my method of doing it and mine alone. This is not Parallels’ method of doing this, and it has not been sanctioned or condoned by them. There are some dangerous elements to it; you follow this procedure at your own risk and I will not be held responsible for damage caused by it if you try this out.

Requirements

You will need the following tools:

  • A Mac running OS X Server. The server needs to be set up as a Profile Manager server, an Open Directory server and, optionally, as a Netboot server. It is also needed on Yosemite for the System Image Utility.
  • A second Mac running the client version of OS X.
  • Both the server and the client need to be running the same version of OS X (Mavericks, Yosemite, whatever) and they need to be patched to the same level. Both Macs need to have either FireWire or Thunderbolt ports.
  • A FireWire or Thunderbolt cable to connect the two Macs together.
  • A SCCM infrastructure with the Parallels SCCM Mac Management Proxy and Netboot server installed.
  • This is optional but I recommend it anyway: a copy of Xcode or another code editor to create your shell scripts in. I know you could just use TextEdit but I prefer something that has proper syntax highlighting, and Xcode is at least free.
  • Patience. Lots of patience. You’ll need it. The process is time-consuming and can be infuriating when you get something wrong.

At the end of this process, you will have an OS X image which can be deployed to your Macs. The image will automatically name its target; it will download, install and configure the Parallels SCCM agent, join itself to your Active Directory domain, attach itself to a managed wireless network and install any additional software that’s not in your base image. The Mac will do this without any user interaction apart from initiating the build process.

Process Overview

The overview of the process is as follows:

  1. Create an OS X profile to join your Mac to your wireless network.
  2. Create a base installation of OS X with the required software and settings.
  3. Create an Automator workflow to deploy the Parallels agent and to do other minor configuration jobs.
  4. Use the System Image Utility to create the image and a workflow to automatically configure the disk layout and computer name.
  5. (Optional) Use the Mac OS X Netboot server to deploy the image to a Mac. This is to make sure that your workflow works and that you’ve got your post-install configuration scripts right before you add the image to your ConfigMgr server. You don’t have to do this but you may find it saves you a lot of time.
  6. Convert the image to a WIM file and add it to your SCCM OSD image library.
  7. Advertise the image to your Macs.

I’m going to assume that you already have your SCCM infrastructure, Parallels SCCM management proxy, Parallels Netboot server and OS X Server working.

Generate an OS X Profile.

Open a browser, go to the address of your Profile Manager (usually https://{hostname.domain}/profilemanager) and go to the Device Groups section. I prefer to generate a profile for each major setting that I’m pushing down. It makes for a little more work getting it set up but if one of your settings breaks something, it makes it easier to troubleshoot as you can remove a specific setting instead of the whole lot at once.

Your profile manager will look something like this:

Untitled

As you can see, I’ve already set up some profiles but I will walk through the process for creating a profile to join your Mac to a wireless network. First of all, create a new device group by pressing the + button in the middle pane. You will be prompted to give the group a name, do so.

Untitled 2

Go to the Settings tab and press the Edit button

Untitled 3

In the General section, change the download type to Manual, put a description in the description field and, under the Security section, change the profile removal section to “With Authorisation”. Put a password in the box that appears. Type it in carefully; there is no confirm box.

Untitled 4

If you are using a wireless network which requires certificates, scroll down to the certificates section and copy your certificates into there by dragging and dropping them. If you have an on-site CA, you may as well put the root trust certificate for that in there as well.

Untitled 5

Go to the Networks section and put in the settings for your network

Untitled 6

When you’re done, press the OK button. You’ll go back to the main Profile Manager screen. Make sure you press the Save button.

I would strongly suggest that you explore Profile Manager and create profiles for other settings as well. For example, you could create one to control your Macs’ energy saving settings or to set up options for your users’ desktops.

When you’re back on the profile manager window, press the Download button and copy the resulting .mobileconfig file to a suitable network share.

Go to a PC with the SCCM console and the PMA plugin installed. Open the Assets and Compliance workspace. Go to Compliance Settings then Configuration Items. Optionally, if you haven’t already, create a folder for Mac profiles. Right click on your folder or on Configuration Items, go to Create Parallels Configuration Item then Mac OS X Configuration Profile from File.

sccmprof

Give the profile a name and description, change the profile type to System then press the Browse button and browse to the network share where you copied the .mobileconfig file. Double click on the mobileconfig file then press the OK button. You then need to go to the Baselines section and create a baseline with your configuration item in. Deploy the baseline to an appropriate collection.

Create an image

On the Mac which doesn’t have OS X Server installed, install your software. Create any additional local user accounts that you require. Make those little tweaks and changes that you inevitably have to make. If you want to make changes to the default user profile, follow the instructions on this very fine website to do so.

Once you’ve got your software installed and have got your profile set up the way you want it, you may want to boot your Mac into Target Mode and use your Server to create a snapshot using the System Image Utility or Disk Utility. This is optional but recommended as you will need to do a lot of testing which may end up being destructive if you make a mistake. Making an image now will at least allow you to roll back without having to start from scratch.

Creating an Automator workflow to perform post-image deployment tasks

Now here comes the messy bit. When you deploy your image to your Macs, you will undoubtedly want them to automatically configure themselves without any user interaction. The only way that I have found to do this reliably is pretty awful but unfortunately I’ve found it to be necessary.

First of all, you need to enable the root account. The quickest way to do so is to open a terminal session and type in the following command:

dsenableroot -u {user with admin rights} -p {that user's password} -r {what you want the root password to be}

Log out and log in with the root user.

Go to System Preferences and go to Users and Groups. Change the Automatic Login option to System Administrator and type in the root password when prompted. When you’ve done that, go to the Security and Privacy section and go to General. Turn on the screensaver password option and set the time to Immediately. Check the “Show a Message…” box and set the lock message to something along the lines of “This Mac is being rebuilt, please be patient”. Close System Preferences for now.

You will need to copy a script from your PMA proxy server called InstallAgentUnattended.sh. It is located in your %Programfiles(x86)%\Parallels\PMA\files folder. Copy it to the Documents folder of your Root user.

Open your code editor (Xcode if you like, something else if you don’t) and enter the following script:

#!/bin/sh

#Get computer's current name
CurrentComputerName=$(scutil --get ComputerName)

#Bring up a dialog box with the computer's name in it and give the user the option to change it. Time out after 60 secs
ComputerName=$(/usr/bin/osascript <<EOT
tell application "System Events"
activate
set ComputerName to text returned of (display dialog "Please Input New Computer Name" default answer "$CurrentComputerName" with icon 2 giving up after 60)
end tell
EOT)

#Did the user press cancel? If so, exit the script

ExitCode=$?
echo $ExitCode

if [ $ExitCode = 1 ]
then
exit 0
fi

#Compare current computername with one set, change if different

CurrentComputerName=$(scutil --get ComputerName)
CurrentLocalHostName=$(scutil --get LocalHostName)
CurrentHostName=$(scutil --get HostName)

echo "CurrentComputerName = $CurrentComputerName"
echo "CurrentLocalHostName = $CurrentLocalHostName"
echo "CurrentHostName = $CurrentHostName"

 if [ "$ComputerName" = "$CurrentComputerName" ]
 then
 echo "ComputerName Matches"
 else
 echo "ComputerName Doesn't Match"
 scutil --set ComputerName "$ComputerName"
 echo "ComputerName Set"
 fi

 if [ "$ComputerName" = "$CurrentHostName" ]
 then
 echo "HostName Matches"
 else
 echo "HostName Doesn't Match"
 scutil --set HostName "$ComputerName"
 echo "HostName Set"
 fi

 if [ "$ComputerName" = "$CurrentLocalHostName" ]
 then
 echo "LocalHostName Matches"
 else
 echo "LocalHostName Doesn't Match"
 scutil --set LocalHostName "$ComputerName"
 echo "LocalHostName Set"
 fi

#Invoke Screensaver
/System/Library/Frameworks/ScreenSaver.framework/Resources/ScreenSaverEngine.app/Contents/MacOS/ScreenSaverEngine

#Join Domain
dsconfigad -add {FQDN.of.your.AD.domain} -user {User with join privs} -password {password for user} -force

#disable automatic login
defaults delete /Library/Preferences/com.apple.loginwindow.plist autoLoginUser
rm /etc/kcpassword

#install Configuration Manager client
chmod 755 /private/var/root/Documents/InstallAgentUnattended.sh
/private/var/root/Documents/InstallAgentUnattended.sh http://FQDN.of.your.PMA.Server:8761/files/pma_agent.dmg {SCCM User} {Password for SCCM User} {FQDN.of.your.AD.Domain}
echo SCCM Client Installed

#Repair disk permissions
diskutil repairPermissions /
echo Disk Permissions Repaired

#Rename boot volume to host name
diskutil rename "Macintosh HD" $HOSTNAME

#disable root
dsenableroot -d -u {User with admin rights on Mac} -p {That user's password}

#Reboot Mac
shutdown -r +60

Obviously you will need to change this to suit your environment.

As you can see, this has several parts. It calls a bit of AppleScript which prompts the user to enter the machine name; the default value is the Mac’s current hostname and the prompt times out after 60 seconds. The script compares what was entered in the box with the Mac’s current names and renames the Mac if they differ. It then invokes the Mac’s screensaver, joins the Mac to your AD domain, removes the automatic logon for the root user along with the saved root password, and downloads the PMA client from the PMA Proxy Server and installs it. It then runs a Repair Permissions on the Mac’s hard disk, renames the Mac’s hard drive to the hostname of the Mac, disables the root account and sets the Mac to reboot itself after 60 minutes. The Mac is given an hour before it reboots so that the PMA can download and apply its initial policies.

At this point, you will probably want to test the script to make sure that it works. This is why I suggested taking a snapshot of your Mac beforehand. Even if you do get it right, you still need to roll back your Mac to how it was before you ran the script.

Once the script has been tested, you will need to create an Automator workflow. Open the Automator app and create a new application. Go to the Utilities section and drag Shell Script to the pane on the right hand side.

Untitled 7

At this point, you have a choice: you can either paste your entire script into there and have it all run as a big block of code or you can drag multiple shell script blocks across and break your code up into sections. I would recommend the latter approach; it makes viewing the progress of your script a lot easier and if you make a mistake in your script blocks, it makes it easier to track where the error is. When you’re finished, save the workflow application in the Documents folder. I have uploaded an anonymised version of my workflow: Login Script.

Finally, open System Preferences again and go to the Users and Groups section. Click on System Administrator and go to Login Items. Put the Automator workflow you created in as a login item. When the Mac logs in for the first time after its image is deployed, it will automatically run your workflow.

I’m sure you’re all thinking that I’m completely insane for suggesting that you do this but, as I say, this is the only way I’ve found that reliably works. I tried using loginhooks and a login script set with a profile but those were infuriatingly unreliable. I considered editing the sudoers file to allow the workflow to work as Root without having to enter a password but I decided that was a long term security risk not worth taking. I have tried to minimise the risk of having Root log on automatically as much as possible; the desktop is only interactive for around 45-60 seconds before the screensaver kicks in and locks the machine out for those who don’t have the root password. Even for those who do have the root password, the Root account is only active for around 5-10 minutes until the workflow disables it after the Repair Disk Permissions command has finished.

Anyway, once that’s all done reboot the Mac into Target mode and connect it to your Mac running OS X Server.

Use the System Image Utility to create a Netboot image of your Mac with a workflow to deploy it.

There is a surprising lack of documentation on the Internet about the System Image Utility. I suppose that’s because it’s so bare bones and most people use other solutions such as DeployStudio to deploy their Macs. I eventually managed to find some and this is what I’ve managed to cobble together.

On the Mac running OS X Server, open the Server utility and enter your username and password when prompted. When the OS X Server app finishes loading, go to the Tools menu and click on System Image Utility. This will open another app which will appear in your dock; if you see yourself using this app a lot, you can right click on it and tell it to stay in your dock.

[Screenshot: the System Image Utility]

Anyway, once the System Image Utility loads click on the Customize button. That will bring up a workflow window similar to Automator’s.

[Screenshot: the System Image Utility workflow window]

The default workflow has two actions in it: Define Image Source and Create Image. Just using these will create a working image but it will not have any kind of automation; the Mac won’t partition its hard drive or name itself automatically. To get this to work, you need to add a few more actions.

There will be a floating window with the possible actions for the System Image Utility open. Find the following three actions and add them to the workflow between the Define Image Source and Create Image actions. Make sure that you add them in the following order:

  1. Partition Disk
  2. Enable Automated Installation
  3. Apply System Configuration Settings

You can now configure the workflow actions themselves.

For the Define Image Source action, change the Source option to the Firewire/Thunderbolt target drive.

For the Partition Disk action, choose the “1 Partition” option and check “Partition the first disk found” and, optionally, “Display confirmation dialog before partitioning”. Checking the second box will give you a 30 second opportunity to create a custom partition scheme when you start the imaging process on your Mac clients. Choose a suitable name for the boot volume and make sure that the disk format is “Mac OS Extended (Journaled)”.

For the Enable Automated Installation action, put the name of the volume that you want the OS to be installed to into the box and check the “Erase before installing” box. Change the main language if you don’t want your Macs to install in English.

The Apply System Configuration Settings action is a little more complicated. This is the section which names your Macs. To do this, you need to provide a properly formatted text file with each Mac’s MAC address and its name. The fields are separated with a tab and there is no header line. Save the file somewhere (I’d suggest in your user’s Documents folder) and put the full path to the file, including the file name, into the “Apply computer name…” box. There is an option in this action which is also supposed to join your Mac to a directory server but I could never get it to work no matter what I tried, so leave that one alone.
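To make the file format concrete: one Mac per line, MAC address, a tab, then the computer name. The addresses and names below are made up; a small shell snippet like this is an easy way to generate the file with real tab characters in it:

```shell
# Write an example computer-name mapping file: <MAC address><tab><name>,
# one entry per line, no header. These entries are fabricated.
printf '00:1e:c2:aa:bb:01\tART-MAC-01\n00:1e:c2:aa:bb:02\tART-MAC-02\n' > computernames.txt

# Show what was written.
cat computernames.txt
```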

The last action is Create Image. Make sure that the Type is NetRestore and check the Include Recovery Partition box. You need to put something into the Installed Volume box but it doesn’t appear to matter what. Put a name for the image into the Image Name and Network Disk boxes and choose a destination to save the image to. I would suggest saving it directly to the /{volume}/Library/Netboot/NetbootSP0 folder as it will appear as a bootable image as soon as the image snapshot has been taken without you having to move or copy it to the correct location.

Once you’ve filled out the form, press the Save button to save your workflow then press Run. The System Image Utility will then generate your image ready for you to test. Do your best to make sure that you get all of this right; if you make any mistakes, you will have to correct them and run the image creation workflow again, even if it is just a single setting or something in your script that’s wrong. The other problem is that if you add any new Macs to your estate, you’ll have to update the text file with the Macs’ names and MAC addresses in it and re-create the image again. This is why I put the “Name your Mac” section into the script.

Test the image

The next step now is to test your Netboot image. To do so, connect your client Mac to the same network segment as your server. Boot it to the desktop and open System Preferences. Go to the Startup Disk pane and you should see the image that you just created as an option.

[Screenshot: the Startup Disk pane]

Click on it and press the Restart button. The Mac will boot into the installation environment and run through its workflow. When it’s finished, it will automatically log on as the Root user and run the login script that you created in a previous step.

Convert the image to a WIM and add it to your OSD Image Library

Once you’re satisfied that the image and the login script run to your satisfaction, you need to add your image to the ConfigMgr image library. Unfortunately, ConfigMgr doesn’t understand what an NBI is so we need to wrap it up into a WIM file.

To convert the image to a WIM file, first of all copy the NBI file to a suitable location on your PMA Proxy Server. Log onto the PMA Proxy using Remote Desktop and open the ConfigMgr client. Go to the Software Library workspace and Operating Systems then Operating System Images. Right click on Operating System Images and click on “Add Mac OS X Operating System Image”.

[Screenshot: the Add Mac OS X Operating System Image wizard]

Click on the first browse button and go to the location where you copied the NBI file to. This must be a local path, not a UNC.

Click on the second browse button and go to the share that you defined when you installed the Netboot agent on your PMA Proxy. This must be a UNC, not a local path. Press the Next button and wait patiently while the NBI image is wrapped up into a WIM file. When the process is finished, the image will be in your Operating System Images library. There is a minor bug here: If you click on a folder underneath the Image library, the image will still be added to the root of the library and not in the folder you selected. There’s nothing stopping you moving it afterwards but this did confuse me a little the first time I came across it. Once the image is added, you should copy it to a distribution point.

Advertise the image to your Macs

Nearly finished!

The final steps are to create a task sequence then deploy the task sequence to a collection. To create the task sequence, open the ConfigMgr console on a PC which has the Parallels console extension installed. Go to the Software Library workspace and Operating Systems. Under there, go to Task Sequences and right click on Task Sequences. Select “Create Task Sequence for Macs” and this will appear:

[Screenshot: the Create Task Sequence for Macs wizard]

Put in a name for the task sequence then press the Browse button. After a small delay, a list of the available OS X images will appear. Choose the one that you want and press the Finish button. The task sequence will then appear in your sequence library but, as with the images, it will appear in the root rather than in a specific folder. The only step left is to deploy the task sequence to a collection; the process for this is identical to the one for Windows PCs. I don’t know if it’s necessary but I always deploy the sequence to the Unknown Computers collection as well as the collections that the Macs sit in, just to be sure that new Macs get it as well.

Assuming that you have set up the Netboot server on the PMA Proxy properly, all of the Macs which are in the collection(s) you advertised the image to will have your image as a boot option. Good luck and have fun!

Bootnote

Despite me spending literally weeks writing this almost 4,000 word blog post when I had the time and inclination to do so, it is worth mentioning again that all of this is going to be obsolete very soon. The next version of the Parallels agent is going to support proper task sequencing. My contact within Parallels tells me that they are mimicking Microsoft’s task sequence UI so that you can deploy software and settings during the build process and that there will be a task sequence wizard on the Mac side which will allow you to select a task sequence to run. I’m guessing (hoping!) that it will be in the existing Parallels Application Portal where you can install optional applications from.

The Grand(ish) Experiment – Two weeks in

It’s been two weeks since I started using the Dell tablet in anger. You may be wondering where the promised updates have gone. Well, here’s one.

I think that the biggest question that I needed to answer was “Can one of these tablets cope as my primary workstation?”. The answer to that is an unequivocal “Yes”. I have been using a Dell Venue 11 Pro 7139. This tablet has a dual core 1.6GHz Core i5, 4GB RAM and a 128GB SSD, which makes it more powerful than a significant portion of the desktop machines in our college; frankly, I would have been shocked if it couldn’t cope. The only thing I’d really want from it would be some more RAM, 4GB is a bit tight these days; sometimes the tablet would freeze while I was using certain System Center components which can be a bit RAM hungry.

The dock that we received was a revision A00 dock which appears to have some issues when using it with multiple monitors. You may recall in my last blog post that I mentioned that I was having difficulty getting a DisplayPort to DVI adapter to work and that I thought I would need an Active one instead. Well, I ordered an Active one and that didn’t work either, even though this should be a supported configuration. After a bit of research, I found out that Dell have put out an A01 revision of this dock which fixes these issues. It looks like Dell still have a load of stock of the A00s as the order number on the box was from the end of November and it’s a complete lottery as to which revision you’ll get when you order one. We ordered ours from BT Business so maybe you’d have more luck if you ordered from Dell directly.

This aside, the dock still worked with the DisplayPort to VGA adapter that we ordered so I have been using that to connect my second monitor. This has been OK but there has been the odd occasion where the tablet “forgets” that there is a monitor attached after the displays or the tablet wake up after going to sleep. Sometimes telling Windows to reactivate the display works, sometimes you need to undock and redock the tablet to force it to start working again. However, I don’t think that this will be an issue for the people who are going to end up using them as most of them won’t have two monitors attached.

The DPI difference between the tablet’s display and the external monitors has been a source of annoyance for me. Each time I undocked the tablet to use it elsewhere, I ended up logging it off and back on so that the desktop was usable. When I redocked afterwards, again I logged off and on so that everything wasn’t massive. Again, I don’t know if a teacher would find this to be an issue.

As a point of interest, when the new Windows 10 build (9926) appeared, I installed it on another 7139 I had lying around and the same resolution issues were still there.

There are still a few things for me to test; I’ve not brought it home to try yet and I haven’t had the opportunity to take it to many meetings. I haven’t tried it in a classroom scenario with visualisers and interactive whiteboards either which is something I will need to do.

The next step is to give a dock and tablet to a teacher and see what they make of it!

DCM Script – Detect Office Activation Status on Windows 7 and Activate if Unactivated

This one was a lot of fun and by “fun”, I mean a complete pain.

Recently, several of my helpdesk calls have been along the lines of “When I open Word, it says that it needs activating”. As I’d hope most people with more than 20 PCs to manage do, we use a Key Management Services (KMS) server to activate all of our Windows and Office clients. Windows and Office are supposed to activate themselves either during the build process or very soon afterwards. However, the PCs need to phone back to the KMS server every 180 days to remain activated, so either the PC hasn’t activated Office during the build process or its activation ticket has expired and it hasn’t managed to get a new one. Therefore, I needed a way to detect whether Office is activated on a computer and activate it if it wasn’t. Detect a state? Remediate it if it isn’t in a desired state? Hmm, this sounds like something that’s perfect for DCM! So I went a-looking, seeing what I could see.

First of all, this post is written for 64 bit machines which are running 32 bit Office. However, if you’re running 64 bit Office or 32 bit Office on 32 bit Windows, it’s just a matter of adjusting the paths for the Office VBS script accordingly.

At first, I hoped that I could use pure PowerShell to fix this. There is a very handy CIM class called SoftwareLicensingProduct which lists the activation status for the Microsoft products installed on your computer. I thought a simple PowerShell command like

Get-CimInstance SoftwareLicensingProduct -Filter "Description LIKE '%KMSCLIENT%'" | select ID, Description, LicenseStatus, Name, GenuineStatus

would give me a nice base to work from. On my Windows 8.1 machine, it does; it lists all of the KMS products on your PC and their activation statuses. However, on Windows 7, that CIM instance only lists the operating system, not Office and unfortunately Windows 7 is what is installed on the vast majority of the computers in my workplace. So that meant going back to the drawing board.

I needed another way to get the activation status for Office. From Office 2010 onwards, there is a VBS script called ospp.vbs. It needs to be run with the cscript interpreter as it’s purely command line rather than GUI driven. There are several switches for it which perform operations like attempting an activation, clearing the activation status, setting the KMS server name and port and displaying the activation status of the various Office products. Running the following command:

cscript "C:\Program Files (x86)\Microsoft Office\Office15\ospp.vbs" /dstatus

returned the following output on my PC with Office 2013 Pro Plus, Project 2013 Standard and Visio 2013 Pro installed on it:

---Processing--------------------------
---------------------------------------
SKU ID: 427a28d1-d17c-4abf-b717-32c780ba6f07
LICENSE NAME: Office 15, OfficeProjectStdVL_KMS_Client edition
LICENSE DESCRIPTION: Office 15, VOLUME_KMSCLIENT channel
LICENSE STATUS: ---LICENSED---
REMAINING GRACE: 177 days (256304 minute(s) before expiring)
Last 5 characters of installed product key: 8QHTT
Activation Type Configuration: ALL
KMS machine name from DNS: kmsserver.domain:1688
Activation Interval: 120 minutes
Renewal Interval: 10080 minutes
KMS host caching: Enabled
---------------------------------------
SKU ID: b322da9c-a2e2-4058-9e4e-f59a6970bd69
LICENSE NAME: Office 15, OfficeProPlusVL_KMS_Client edition
LICENSE DESCRIPTION: Office 15, VOLUME_KMSCLIENT channel
LICENSE STATUS: ---LICENSED---
REMAINING GRACE: 177 days (256304 minute(s) before expiring)
Last 5 characters of installed product key: GVGXT
Activation Type Configuration: ALL
KMS machine name from DNS: kmsserver.domain:1688
Activation Interval: 120 minutes
Renewal Interval: 10080 minutes
KMS host caching: Enabled
---------------------------------------
SKU ID: e13ac10e-75d0-4aff-a0cd-764982cf541c
LICENSE NAME: Office 15, OfficeVisioProVL_KMS_Client edition
LICENSE DESCRIPTION: Office 15, VOLUME_KMSCLIENT channel
LICENSE STATUS: ---LICENSED---
REMAINING GRACE: 177 days (256304 minute(s) before expiring)
Last 5 characters of installed product key: RM3B3
Activation Type Configuration: ALL
KMS machine name from DNS: kmsserver.domain:1688
Activation Interval: 120 minutes
Renewal Interval: 10080 minutes
KMS host caching: Enabled
---------------------------------------
---------------------------------------
---Exiting-----------------------------

Apart from the KMS server name, that output is verbatim. There is some very useful information in there: the product license, the activation information, the KMS server it’s using to activate against and how long the activation has left. It’s great! Unfortunately, it’s also a big lump of text which isn’t especially useful by itself.

At this point, I could have just created a package which ran

cscript "C:\Program Files (x86)\Microsoft Office\Office15\ospp.vbs" /act

and called it a day. It certainly would have worked to an extent but I still wanted to use DCM. Using DCM would have been better because:

  • I can, in theory, set it to detect whether Office needs activating and only run the activation script if it isn’t activated, whereas using a package with that command line in it will attempt activation of Office whether it needs it or not
  • Using a package would be a set-once kind of affair: if Office decides to deactivate itself or fails reactivation after the KMS grace period expires, a package won’t re-run the script, whereas with DCM I can re-run the detection script every hour, day, week, month or whatever

So I turned back to PowerShell and, eventually, came up with this:

C:\Windows\System32\cscript.exe 'C:\Program Files (x86)\Microsoft Office\Office15\OSPP.VBS' /dstatus | Out-File $env:temp\actstat.txt

$ActivationStatus = $($Things = $(Get-Content $env:temp\actstat.txt -raw) `
                            -replace ":"," =" `
                            -split "---------------------------------------" `
                            -notmatch "---Processing--------------------------" `
                            -notmatch "---Exiting-----------------------------"
                       $Things | ForEach-Object {
                       $Props = ConvertFrom-StringData -StringData ($_ -replace '\n-\s+')
                       New-Object psobject -Property $Props  | Select-Object "SKU ID", "LICENSE NAME", "LICENSE DESCRIPTION", "LICENSE STATUS"
        })

$Var = "Office Activated "
for ($i=0; $i -le $ActivationStatus.Count-2; $i++) {
    if ($ActivationStatus[$i]."LICENSE STATUS" -eq "---LICENSED---") {
        $Var = $Var + "OK "
        }

    else {
        $Var = $Var + "Bad "
        }
        }

If ($Var -like "*Bad*") {

    echo "Office Not Activated"
}
else
{
    echo "Office Activated"
}

That script runs the Office activation VBScript and saves the output to a text file in the user’s TEMP directory. It reads the created text file and dumps the entire lot into a variable called Things (I was experimenting, I couldn’t think of a better name once I had finished and hey, it worked! If it ain’t broke, don’t fix it). It converts the text file into a series of PowerShell objects using the series of dashes to separate them, replaces any colons with equals signs and excludes the “Processing” and “Exiting” lines. It uses the ConvertFrom-StringData command to add and populate properties on the objects, which is why the colons needed replacing. It then selects the particular properties that I’m interested in. The whole lot gets put into an array called ActivationStatus which I can now use to do what I need to do.

The script creates another variable called Var and pre-populates it with a bit of arbitrary text. It runs through all but the last object in the ActivationStatus array (if you look at the text file output, you’ll see that the series of dashes appears twice at the end, so my little routine creates a blank but not null object at the end of the array) and checks to see if the “LICENSE STATUS” property is equal to “---LICENSED---”. If so, it appends “OK ” onto the end of Var; if not, it adds “Bad ”. Finally, the script looks at Var to see if the word “Bad” appears in it and echoes “Office Not Activated” or “Office Activated” back to ConfigMgr accordingly.
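The pass/fail idea is easy to see in miniature. This is a hedged shell re-statement of the same logic, not the production script: it works on a trimmed, hand-made sample of the /dstatus output, with the real ---NOTIFICATIONS--- status standing in for an unactivated product.

```shell
# Build a trimmed sample of the ospp.vbs /dstatus output.
cat > actstat.txt <<'EOF'
LICENSE NAME: Office 15, OfficeProPlusVL_KMS_Client edition
LICENSE STATUS: ---LICENSED---
LICENSE NAME: Office 15, OfficeVisioProVL_KMS_Client edition
LICENSE STATUS: ---NOTIFICATIONS---
EOF

# Count all status lines, then the ones that report ---LICENSED---.
total=$(grep -c '^LICENSE STATUS:' actstat.txt)
good=$(grep -c '^LICENSE STATUS: ---LICENSED---$' actstat.txt)

# If every product is licensed, report success; otherwise flag it.
if [ "$good" -eq "$total" ]; then
  echo "Office Activated"
else
  echo "Office Not Activated"
fi
```

With this sample, one of the two products is unactivated, so the snippet prints “Office Not Activated”.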

The remediation script looks like this:

cscript "C:\Program Files (x86)\Microsoft Office\Office15\ospp.vbs" /act

Simple, no?

When you’ve created the Detection and Remediation scripts inside ConfigMgr, create a Compliance Rule which looks for a string called “Office Activated”. Then, as always, either create a new baseline and deploy it to a collection or add it to an existing one.

Controlling Dual Monitor Modes via the Command Line

This one is absurdly simple but pretty useful nevertheless.

At work, we have been getting a lot of calls recently where the teacher has complained that their interactive whiteboards aren’t working properly and all that they can see on the projected surface is their wallpaper. I’m sure that anyone who has experience with these things will immediately see that, of course, their whiteboards are fine and that the PCs are set to extend the desktop onto a secondary display rather than clone it.

There are some big advantages to extending the desktop and I think that there are a few more IT literate teachers who have figured this out and decided to extend their desktop. However, what they’re also doing is forgetting to set it back when they’re finished and therefore upsetting the next teacher who goes to use the room. This of course generates a call to us and wastes everybody’s time.

I wanted to see if there was a way to control extending or cloning displays using a script or a PowerShell command. I googled for a while and found a few third party programs which claimed they could do it but they didn’t work that well. I eventually came across this page which informed me about a program built into Windows from Windows 7 onwards called displayswitch.exe. It even has some command line switches!

displayswitch.exe /clone
displayswitch.exe /extend
displayswitch.exe /internal
displayswitch.exe /external

Those are pretty self-explanatory, I think! I then created a couple of GPOs with WMI filters which detect interactive whiteboards. Inside those GPOs are startup and logout scripts with the following command:

displayswitch.exe /clone

So each time a PC with an interactive whiteboard attached to it is started or logged out, it puts itself back into clone mode. Easy!


Managing Macs using System Center Configuration Manager – Part Two

In my previous article, I described the agent that Microsoft have put into System Center Configuration Manager to manage Macs. Overall, while I was happy to have some kind of management facility for our Macs, I found it to be somewhat inadequate for our needs and wished it was better. I also mentioned that Parallels, the company behind the famous Parallels Desktop virtualisation package for the Mac, contacted us and invited us to try out their plugin for the Mac. This article will explain what the Parallels agent is capable of, how well it works and how stable it has proven to be since we’ve had it installed.

Parallels Mac Management Agent for ConfigMgr

The Parallels agent (PMA) is an ISV proxy for ConfigMgr. It acts as a bridge between your Macs and the management point in your ConfigMgr infrastructure. The agent doesn’t need HTTPS to be enabled on your ConfigMgr infrastructure, and ConfigMgr sees Macs as full clients. The Parallels agent fills in a lot of the gaps in the native Microsoft agent, such as:

  1. A graphical and scriptable installer for the agent
  2. A Software Center-like application which lists available and installed software. Users can also elect to install published software if desired.
  3. Support for optional and required software installations
  4. Operating System Deployment
  5. The ability to deploy .mobileconfig files through DCM
  6. A VNC client launchable through the ConfigMgr console so you can control your Macs remotely
  7. Auto-enrollment of Macs in your enterprise
  8. Support for FileVault and storage of FileVault keys

It supports almost everything else that the native ConfigMgr client does and it doesn’t require HTTPS to be turned on across your infrastructure. In addition, if you use Parallels Desktop for Mac Enterprise Edition, you can use the plugin to manage VMs.

Installation

The PMA requires a Windows server to run on. In theory, you can have the PMA installed on the server hosting your MP or it can live on a separate server. There are separate versions of the PMA available for ConfigMgr 2007 and 2012/2012 SP1/2012 R2.

Earlier versions of the PMA didn’t support HTTPS infrastructures properly, so you needed to have at least one MP and one DP running in HTTP mode. The latest version supports HTTPS across the board, although you do still need at least one DP running in anonymous mode for the Macs to download from.

IIS needs to be installed on the server along with WebDAV and BITS. Full and concise instructions are included so I won’t go over the process here. Anybody who has installed a ConfigMgr infrastructure will find it easy enough.

If you are using the OSD component, it needs to be installed on a server with a PXE enabled DP. If you have multiple subnets and/or VLANs, you will need an IP helper pointing at the server for the Macs to find it.

Software Deployment

The PMA supports two methods of deploying software. You can use either Packages or Applications.

Generally speaking, there are three ways to install a piece of software on a Mac, not counting the App Store:

  1. You can have an archive file (usually a DMG) with a .app bundle in it to be copied to your /Applications or ~/Applications folder
  2. You can have an archive file with a PKG or MPKG installer to install your application
  3. You can install from a PKG or MPKG.

Installing using Packages

Unlike the Microsoft agent, you don’t need to repackage your software to deploy it with the PMA. Instead, you can deploy it using legacy style Packages. To deploy a piece of software using ConfigMgr Packages, you create a Package in the same way as you would for Windows software and copy the software to a Windows file share. You then need to create a Program inside the package with a command line to install the software. Using the three scenarios above, the command lines would look like this:

  1. :Firefox 19.0.2.dmg/Firefox.app:/Applications:
  2. :iTunes11.0.2.dmg/Install iTunes.pkg::
  3. install.pkg

The first command tells the PMA to open the DMG and copy the Firefox.app bundle in the DMG to the /Applications folder. The second tells the PMA to open the DMG and execute the .pkg file with a switch to install it silently. The third runs an arbitrary command.
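The way I read that colon-delimited syntax, the first field is the path inside the archive and the second is the destination. A quick shell illustration of how the fields break apart (this parsing is my own sketch for clarity, not Parallels’ code):

```shell
# A Program command line of the first form, as shown above.
line=':Firefox 19.0.2.dmg/Firefox.app:/Applications:'

# Split on the colons: field 2 is the path inside the archive,
# field 3 is the destination folder.
src=$(printf '%s' "$line" | cut -d: -f2)
dst=$(printf '%s' "$line" | cut -d: -f3)

echo "copy $src to $dst"
```

For the line above, this prints “copy Firefox 19.0.2.dmg/Firefox.app to /Applications”.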

Once the package and the program inside the package have been created, you distribute to a DP and deploy it to a collection as standard.

Deploying software using this method is functional and it works nicely. The disadvantage is that there is no way to tell if a deployment has worked without either delving through the logs or looking in the Applications folder. If you deploy the Package to a collection, the PMA will try to install the Package on the Mac whether it’s already on there or not.

Installing using Applications

As of version 3.0 of the PMA, Parallels have started supporting ConfigMgr Applications as well as Packages. Parallels are using Microsoft’s cmmac file format to achieve this. This means that you need to repackage applications and add them to the ConfigMgr console using the same method as you do for the native ConfigMgr client. This is a bit of a pain but the benefits that doing so brings make it worthwhile. As with the Microsoft client, there are detection rules built into the Application, meaning that the Parallels client can check to see if a piece of software is deployed on the Mac before it attempts to deploy it. If it is already there, it gets marked as installed and the installation is skipped.

It also brings this to the table:

[Screenshot: the Parallels Application Portal]

That is the Parallels Application Portal. Much like Software Center on Windows, this gives you a list of software that has been allocated to the Mac. It allows for optional and required installations. If a deployment is marked as optional, the software is listed with a nice Install button next to it.

As with the Microsoft agent, you need to be careful with the detection rules that you put into your Applications. The PMA runs a scan of the /Applications and /Library folders looking for Info.plist files. It extracts the application name and version from those PLISTs and adds them to an index. It then looks at the detection rules in the Application and compares them to the index that it builds. If there’s a match, it marks the application as installed. If you don’t get the detection rules right, the PMA won’t detect the application even if it has been installed properly and will eventually try to reinstall it. I spent a very interesting afternoon chasing that one down. There are also some applications which don’t have Info.plist files or which get installed in strange places. The latest Java update doesn’t have an Info.plist; it has an alias to another PLIST file called info.plist instead. The PMA didn’t pick that one up.
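To make the indexing idea concrete, here is a rough, self-contained sketch of that kind of scan. It builds a fake Applications folder with a single hand-written Info.plist (real ones are far larger, and on a Mac you would read them with defaults read or plutil rather than sed), then pulls the bundle name and version out of each plist it finds:

```shell
# Create a fabricated Applications tree with one app bundle in it.
APPS=./Applications
mkdir -p "$APPS/Firefox.app/Contents"
cat > "$APPS/Firefox.app/Contents/Info.plist" <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<plist version="1.0"><dict>
  <key>CFBundleName</key><string>Firefox</string>
  <key>CFBundleShortVersionString</key><string>19.0.2</string>
</dict></plist>
EOF

# Walk the tree, pulling the name and version out of each Info.plist found.
find "$APPS" -name Info.plist | while read -r plist; do
  name=$(sed -n 's:.*<key>CFBundleName</key><string>\(.*\)</string>.*:\1:p' "$plist")
  ver=$(sed -n 's:.*<key>CFBundleShortVersionString</key><string>\(.*\)</string>.*:\1:p' "$plist")
  echo "$name $ver"
done
```

For the plist above, this prints “Firefox 19.0.2”, which is exactly the sort of name/version pair a detection rule has to match.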

Operating System Deployment

Quite impressively, Parallels have managed to get an operating system deployment facility built into the PMA. It’s basic but it works.

First of all, you need to create a base image for your Macs and take a snapshot of it using the System Image Utility found in OS X at /System/Library/CoreServices/. You can create custom workflows in this utility to help you blank the hard disk before deployment and to automate the process. Make sure you tell it to make a NetRestore image, not a NetBoot image like I first did. Once you’ve done that, you tell it where you want to save your image and set it on its way. The end result is an NBI file, which is a bundle with your system image and a bootstrap.

You then copy the resulting NBI file onto a PC or server with the ConfigMgr console and Parallels console addon installed. Once it’s on there, open the console and go to the Software Library workspace. Go to Operating Systems, right click on Operating System Images and choose Add Mac OS X Operating System Image. A wizard appears which asks you to point at the NBI file you generated and then to a network share where it creates a WIM file for ConfigMgr.

[Screenshot: the Add Mac OS X Operating System Image wizard]

Once the WIM has been generated, it adds itself to the console, but one quirk I have noticed is that if you try to create it in a folder, it doesn’t go in there; it gets placed in the root instead. You can move it manually afterwards so it’s not a huge issue.

The next step is to create a task sequence. There is a custom wizard for this too: you need to right click on Task Sequences under Operating System Deployment in the Software Library workspace then choose Create Task Sequence for Macs.

[Screenshot: the Create Task Sequence for Macs wizard]

You give the task sequence a name, choose the image that you want to use and press the Finish button. Again, if you’re using folders to organise your task sequences and you try to create the task sequence in a folder, it will get placed in the root of the structure rather than in the folder that you tried to create it in. You can move it if you wish.

From there, it’s pretty much standard ConfigMgr. You need to distribute the image to a DP and publish the task sequence to a collection. The task sequence then appears to the Macs in that collection as a standard Netboot image with the title that you gave to it. You can access it the usual way, either through the Startup Disk pane in System Preferences or by holding down the option key on startup.

[Screenshot: Startup Disk preference pane showing the published image]

Unfortunately, what it doesn’t do is allow for any kind of automatic post-image deployment sequence. Although in theory the System Image Utility supports custom workflows which allow software installations and the running of scripts, I haven’t managed to get it to deploy the agent automatically. I have therefore created a script which the admin deploying the Mac needs to run and which (amongst other things) names the Mac and installs the PMA. From what Parallels tell me, this is being worked on.
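For illustration, a post-image setup script along those lines might look like the sketch below. The function name, package path and naming scheme are my own placeholders, not the script I actually use; set RUN=echo for a dry run.

```shell
#!/bin/bash
# Hypothetical sketch of a post-image setup script: names the Mac and
# installs the Parallels Management Agent. The package path and names
# are placeholders, not the real script described above.
post_image_setup() {
    local run="${RUN:-}"    # set RUN=echo to print commands instead of running them
    local newname="$1"
    [ -n "$newname" ] || { echo "usage: post_image_setup <computer-name>" >&2; return 1; }

    # Set all three OS X name records to the supplied name
    $run scutil --set ComputerName "$newname"
    $run scutil --set LocalHostName "$newname"
    $run scutil --set HostName "$newname"

    # Install the PMA package to the root volume (requires elevation)
    $run sudo installer -pkg /Users/admin/pma_agent.pkg -target /
}
```

In real use you would run it elevated straight after the image lands; with RUN=echo it simply prints what it would do.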

DCM – Scripts and Profiles

The PMA supports the usage of DCM Bash scripts in the same way as the Microsoft agent does. There isn’t much to say about this: it works and it’s a useful thing to have. The other way of getting settings onto Macs with the PMA is via mobileconfig files, generated either by Profile Manager in OS X Server or by a generator built into the ConfigMgr console add-in. The generator looks like this:

[Screenshot: mobileconfig generator in the ConfigMgr console]

Look familiar? Unfortunately there aren’t all of the options here that are in Profile Manager so if you want to do something more advanced than what’s on offer here, you still need a copy of OS X Server and Profile Manager to generate the profile.

To deploy a mobileconfig file using the PMA, you need to go to the Assets and Compliance workspace, go to Compliance Settings and right click on Configuration Items. Go to Create Parallels Configuration Item then to Mac OS X Configuration Profile.

[Screenshot: Mac OS X Configuration Profile wizard]

You then give the configuration item a name, decide whether it’s a User or System profile and point the wizard at the mobileconfig file generated by Profile Manager. Once you’ve done that, there is a new configuration item in your console which can be added to a DCM baseline and deployed to a collection.

I have used this facility for various purposes: for configuring AirPort, for joining the Mac to the AD domain, for setting up the user’s desktop settings and wallpaper, for setting up Time Machine and for more besides. It’s a great facility to have and rather more user friendly than delving through PLISTs to find settings.
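For reference, a mobileconfig file is just an XML property list. A stripped-down skeleton looks something like the following; the Dock payload and its autohide key are purely illustrative (the keys you actually need depend on the settings domain, which is why Profile Manager remains useful for generating them), and the identifiers and UUIDs are made-up placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadType</key><string>Configuration</string>
    <key>PayloadDisplayName</key><string>Example settings</string>
    <key>PayloadIdentifier</key><string>com.example.settings</string>
    <key>PayloadUUID</key><string>7C5F4E02-1111-2222-3333-444455556666</string>
    <key>PayloadVersion</key><integer>1</integer>
    <key>PayloadContent</key>
    <array>
        <dict>
            <!-- One dict per settings domain; the keys vary by payload type -->
            <key>PayloadType</key><string>com.apple.dock</string>
            <key>PayloadIdentifier</key><string>com.example.settings.dock</string>
            <key>PayloadUUID</key><string>7C5F4E02-1111-2222-3333-444455557777</string>
            <key>PayloadVersion</key><integer>1</integer>
            <key>autohide</key><true/>
        </dict>
    </array>
</dict>
</plist>
```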

Other features – VNC Client, Auto-enrolment and Inventory

Well, these do what they say on the tin. We have a VNC client:

[Screenshot: VNC client window]

It takes an inventory:

[Screenshot: Mac inventory in the ConfigMgr console]

It has the ability to detect Macs on your network and automatically install the agent on them. They all work. There isn’t really much else to be said.

How well does it work?

So clearly, the PMA has far more features than the Microsoft agent does but a piece of software can have all the features in the world and still be useless if it isn’t stable. In this regard, I am delighted to say that the Parallels agent has been rock solid for us. It has been continually improved and has had feature after feature added. It doesn’t quite make a Mac a first class citizen on a Windows network but it comes very close and going by the way that Parallels have improved the product over the last two years, I’m confident that the gap will close. Parallels have been a lot quicker in getting new versions of OS X supported with their plugin too, it already has support for Yosemite.

It hasn’t been entirely problem free but when I’ve contacted Parallels Support, they’ve been quick and efficient and they’ve got the problem resolved. Most problems that I’ve come across I’ve managed to solve myself with a bit of thought.

Although Parallels claim that the PMA can be installed on the same server as your management point, I couldn’t get it to work when I tried. I ended up putting it on its own hardware. This was with v1.0 of the product though; we’re now on v3.1 so they may have improved that since then.

Having the PMA has also meant that I no longer need a Magic Triangle set up to support and configure my OS X clients. I don’t need Profile Manager or Workgroup Manager to deploy settings, and I don’t need OS X Server or DeployStudio to deploy images. The only things I still need OS X Server for are Profile Manager, to generate the profiles, and (with the arrival of Yosemite) the System Image Utility.

The only real downside to the PMA is that it’s expensive and that you have to pay full price for it annually. That may be hard to justify if you’ve already spent a fortune on a System Center license.

Conclusion

So let’s do a quick advantage/disadvantage comparison:

Microsoft client advantages:

  • Native client so no additional cost
  • Support from Microsoft

Microsoft client disadvantages:

  • Sparse feature set
  • Required installs only
  • Complicated DCM
  • Takes a long time to install multiple applications
  • Requires HTTPS
  • Slow to support new versions of OS X
  • No visible status indicators, next to impossible to see what’s going on.

Parallels client advantages

  • Includes lots of additional features, bringing Mac management to near parity with Windows machines
  • Optional and user initiated installs supported
  • Software Center equivalent and a System Preferences pane to check status of agent. Very thorough logs to check on what the agent is doing are available too.
  • OSD
  • Doesn’t require HTTPS
  • Supports SCCM 2007
  • Much easier to deploy settings by using mobileconfig files

Parallels client disadvantages

  • Expensive
  • Requires an additional DP
  • Probably requires an additional server to install the proxy

They’re as good as each other when it comes to running DCM scripts and taking inventories. So the answer is simple: If you can afford it, get the Parallels Management Agent. If I were the head of the System Center division at Microsoft, I’d be going cap in hand to Satya Nadella and telling him to drive all the money to Parallels HQ to acquire their product from them. Frankly, it puts Microsoft’s own efforts to shame.

DCM Script – Disable On-Board Sound Card if USB Device is Attached

The first script in my new library is one that I am quite proud of as it was the first that I created to solve a relatively complex problem. It came to be because of the Music department at the college that I work at. The PCs in their department have external USB sound cards for students to plug MIDI instruments into and other such things (Hell, I’m not a musician, I don’t understand what they’re for exactly!). The problem was that Sonar, the music sequencing software that they use, was giving them trouble when both the on-board audio and the USB device were enabled. They therefore wanted me to disable the on-board sound card in the machines so that it wouldn’t happen again.

I could have gone to each of their PCs and just disabled the onboard sound in the BIOS or in Windows but that would be a short term fix; if the PC got replaced or rebuilt the person doing that would have to remember to disable it again. I therefore wrote this:


# Query WMI for all sound devices in the machine
$SoundDevices = Get-CimInstance Win32_SoundDevice

if ($SoundDevices.DeviceID -like "USB*")
{
    # USB sound card detected; check whether the on-board HDAUDIO device is still active
    $HDAudio = Get-CimInstance Win32_SoundDevice -Filter 'DeviceID LIKE "HDAUDIO%"'
    if ($HDAudio.StatusInfo -eq 3)
    {
        # On-board audio still enabled; return a non-compliant string so remediation runs
        Write-Output "USB Audio detected, on-board audio needs to be disabled"
    }
    else
    {
        # USB present, on-board already disabled
        Write-Output "OK"
    }
}
else
{
    # No USB audio device; leave the on-board sound alone
    Write-Output "OK"
}

This script queries WMI to find out what sound devices are installed in the machine. If it detects one with USB in the device ID string, it goes on to see if the on-board HDAUDIO device is enabled. If it is, it sends a string back to ConfigMgr saying that remediation needs to happen. If the on-board device is disabled, it sends “OK” back to ConfigMgr. If there is no USB audio device installed at all, it also sends “OK” back to ConfigMgr.

The remediation script is quite simple. Since these are Windows 7 machines, there is no native PowerShell way of managing hardware, so I had to use the old DEVCON command. The remediation script looks like this:

%Path_to_file%\devcon.exe disable HDAUDIO\*

That disables any device with HDAUDIO at the beginning of its device ID.

Set the compliance rule on the DCM script to look for a string that says “OK” and that’s it. Then add the rule to an existing baseline or create a new one and deploy it to a collection.


Managing Macs using System Center Configuration Manager – Part One

This post is about one of my favourite subjects, namely Configuration Manager, referred to hereafter as ConfigMgr. If you don’t care about the intricacies of desktop management, I suggest you look away now cos this ain’t gonna interest you!

Before I get too far into this post, I should mention that I’ve written about this subject before on my blog at EduGeek. The content here therefore isn’t that new but I’ve rewritten some of it and hopefully improved on it a little too. I will also say that all of this was tried almost two years ago now so chances are that things have changed a little with ConfigMgr 2012 R2. From what I understand though, most of what I’ve written here is still accurate.

Anyway, I spend a lot of time at work using ConfigMgr to manage the computers on our Windows network. We use it to almost its full extent; we use it for software and update deployment, operating system deployment, for auditing software usage, for configuration of workstations using DCM and a fair bit more besides.

As well as having more than 1500 Windows PCs, laptops and servers, I also have around 80 Macs to manage as well. To put it mildly, they were a nuisance. They were essentially unmanaged; whenever an update or a piece of software came along, we had to go to each individual Mac and install it by hand. The remote administration tools that we were using (Apple Remote Desktop and Profile Manager) were woefully inadequate. ARD requires far too much interaction and hasn’t had any significant update since Leopard was released. Profile Manager does an OK job of pushing down settings but for software management, it assumes that the Macs are personal devices getting all of their software from the App Store. That’s not really good enough. We were desperate to find something better.

We had been using ConfigMgr to manage our Windows PCs for a couple of years by that point and we had recently upgraded to 2012 SP1 which featured Mac management for the first time. We figured that we may as well give it a go and see what it was like. This is what I found out.

First of all, ConfigMgr treats Mac clients as mobile devices, which means that you have to set up an HTTPS infrastructure and install an enrolment point for your Macs to talk to. Your management point needs to talk HTTPS, as do your distribution points. That also means that you need to allocate certificates to your PXE points and task sequence boot media if you want them to talk to the rest of your infrastructure.

Once you have all of this set up, you need to enrol your Macs. Bear in mind that I looked at this when ConfigMgr 2012 SP1 was the current version. I understand that the process has changed a little in 2012 R2.

First of all, you need to download the Mac Management Tools from here for 2012 SP1 and here for 2012 R2. This gets you an MSI file which you need to install on your Windows PC. That MSI file contains a DMG file which you need to copy to your Mac. In turn, that DMG file contains the installer for the Mac client, the utility for enrolling your Macs in ConfigMgr and an application repackager. You have to install the client first from an elevated terminal session. Once that’s installed, you need to run another command to enrol your Mac into ConfigMgr. Assuming that you get this right, it will download a certificate and you’re good to go. When I was setting up the Macs to use this, I found a very good blog post by James Bannan which goes into a lot more detail.
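From memory, the install-and-enrol steps look roughly like the sketch below. The server FQDN and enrolment account are placeholders, and the exact CMEnroll path and switches may differ between client versions, so treat this as an outline rather than gospel; set RUN=echo for a dry run.

```shell
#!/bin/bash
# Outline of the ConfigMgr Mac client install and enrolment steps.
# Server name and enrolment account are placeholders.
enrol_mac() {
    local run="${RUN:-}"    # set RUN=echo to print commands instead of running them

    # Install the client package from the mounted DMG to the root volume
    $run sudo installer -pkg /Volumes/macclient/CMClient.pkg -target /

    # Enrol against the enrolment proxy point; this prompts for the account
    # password and pulls down the client certificate
    $run sudo /Library/Application\ Support/Microsoft/CCM/Tools/CMEnroll \
        -s enroll.contoso.com -ignorecertchainvalidation -u 'CONTOSO\cmenroll'
}
```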

Once your Mac has been enrolled, you will want to start doing something useful with it. At the moment, the Microsoft client has the following abilities:

  1. You can deploy software
  2. You can install operating system updates using the software deployment mechanism
  3. You can check and change settings by using DCM to modify PLIST files
  4. You can check and change settings by using DCM and Bash scripts to return values and make changes
  5. The agent takes an inventory of the software and hardware installed on your Mac and uploads it to your management point.

Deployment of Software and Updates

Deploying software on the Macs is broadly similar to doing the same process on Windows computers; you need to add the software to ConfigMgr as an application, create a deployment type with some detection rules, distribute the software to a DP and deploy the application to a collection. The one difference is that you need to repackage the software into a format that ConfigMgr understands. It has a specific format for Mac software called “cmmac”. This is essentially a refactored ZIP file containing either a .app, a .pkg or a .mpkg along with an XML file which has an inventory of the ZIP, installation instructions and some predefined detection rules. I don’t want to make this already long post any longer than it needs to be so I’ll link to Mr. Bannan’s blog again which has a very good rundown of the entire process.
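As a rough sketch, repackaging with the CMAppUtil tool from the client DMG looks something like this. The paths are placeholders and the switches are from memory, so verify them against the tool’s own help output; set RUN=echo for a dry run.

```shell
#!/bin/bash
# Sketch of repackaging a Mac installer into ConfigMgr's .cmmac format.
# Paths are placeholders; -c names the source .app/.pkg/.mpkg and -o the
# folder to write the .cmmac into (switches from memory, so verify them).
repackage_app() {
    local run="${RUN:-}"    # set RUN=echo to print the command instead of running it
    local src="$1" outdir="$2"
    $run ./CMAppUtil -c "$src" -o "$outdir"
}
```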

Changing settings using PLIST files

This isn’t the simplest of processes but it is quite effective. The first step is to open the ConfigMgr console on your Windows PC and go to the Assets and Compliance workspace. From there, go to Overview, then Compliance Settings, then Configuration Items. Right click and choose Create Configuration Item. This will bring up the following window:

[Screenshot: Create Configuration Item wizard]

This example is going to set a proxy server against a network interface so I have named it appropriately and given it a description. Make sure that you set the Configuration Item Type to Mac OS X, then press the Next button.

[Screenshot: OS X version selection]

The next box lets you target your setting to specific versions of OS X. This screenshot was taken nearly two years ago, when Microsoft hadn’t got around to adding Mountain Lion support. The current version supports up to and including Mavericks but not Yosemite (yet). Choose a specific version or not depending on what you need and press Next.

[Screenshot: create setting window]

You then need to tell ConfigMgr which PLIST file you’re editing and which key you want to change. You also need to tell it whether the key is a string, a number, a boolean value and so on. Once you’ve done that, change to the Compliance Rules tab.

[Screenshot: compliance rule editor]

You need to add a rule for each setting that you’re changing. The one in the example above sets the network name of the HTTP proxy server for the Ethernet interface on the Mac. To complete this example, you’d also need to set one for the HTTPS proxy, the port number and any proxy exceptions. Make sure that Remediate is checked on any rules that you create and finish the wizard.

Once your compliance rule is completed, you will need to create a DCM baseline or add it to an existing baseline and deploy that baseline to a collection. I’m not going to go through the process here as it’s largely identical to doing it for a Windows PC.

Changing settings using Bash Scripts

This is probably the more powerful way of using DCM as you’re not relying purely on PLIST files to make your changes. If you can detect and remediate the setting that you want to change by using Bash, you can use a script here. This could be a setting in an XML file, a config file somewhere, a PLIST and so on; I’m sure you get the idea. The process for creating a compliance rule using a script is largely similar to creating one for a PLIST, and even more similar to creating one for a Windows machine. When you get to the third window, choose Bash Script as the setting type instead of Preference File. You get the opportunity to input two scripts: one to detect the setting and one to change it.
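As a concrete illustration of such a discovery/remediation pair, the sketch below keeps a made-up setting in a plain config file; a real pair on a Mac might use defaults(1) against a PLIST instead, but the shape is the same: the discovery script echoes a value for the compliance rule to match, and the remediation script enforces it.

```shell
#!/bin/bash
# Illustrative DCM pair for a made-up setting in a plain config file.
# The file path and setting name are invented for this example.
CONF="${CONF:-/tmp/demo_app.conf}"

# Discovery script: echo a string for the ConfigMgr compliance rule to
# compare against ("Compliant" would be the expected value).
discover() {
    if grep -q '^ScreenSaverLock=enabled$' "$CONF" 2>/dev/null; then
        echo "Compliant"
    else
        echo "Non-compliant"
    fi
}

# Remediation script: force the setting to the desired value, adding it
# if it is missing entirely.
remediate() {
    if grep -q '^ScreenSaverLock=' "$CONF" 2>/dev/null; then
        sed -i.bak 's/^ScreenSaverLock=.*/ScreenSaverLock=enabled/' "$CONF"
    else
        echo 'ScreenSaverLock=enabled' >> "$CONF"
    fi
}
```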

System Inventory

Again, this works in the same manner as it does for Windows machines, albeit not quite as detailed. At the very least, you get a list of the hardware and software installed on the machine and the agent keeps track of any changes made. Asset Intelligence and Software Metering aren’t supported, however.

What can’t it do?

  1. OSD
  2. Remote Control
  3. Asset Intelligence
  4. Antivirus monitoring (Although it will deploy SCEP for Mac happily enough)
  5. Software Metering
  6. Power Management (Not easily anyway)

Results

So I’ve covered how it all works. The question that you may be asking now is “How well does it work?”. The answer two years ago was “It works OK… ish. Maybe”. I shall try to explain.

The whole thing feels very much like a v0.1 beta rather than proper release software. It’s functional up to a point but there are some very rough edges and the functionality is nowhere near as strong on the Mac (and presumably Linux too) as it is on a Windows PC.

For starters, you can only deploy applications to machines and not to users, and you can’t have optional installs. There is no Software Center, so you can’t easily see what software has been deployed and what software is supposed to be deployed. When the agent detects a deployment, it comes up with a sixty-minute countdown, the length of which can’t (or couldn’t) be changed. You can tell the Mac to start the deployment when you see the countdown, but if you’re deploying (say) six pieces of software and you leave the Macs unattended, the countdown comes up, expires and installs the software, then the next countdown comes up, expires and installs the software, and so on. It can take hours for multiple deployments to finish if you’re not paying attention.

I also found that the detection of deployments was rather erratic. Just like with Applications for Windows PCs, there are detection rules which ConfigMgr uses to determine whether a piece of software is installed on the Mac or not. The client is supposed to use these rules to skip installation of deployed applications if it detects that they’re already present. Unfortunately the detection process seemed unreliable and our Macs had a habit of repeatedly trying to install software that was already there. The process then fails because the installer detects that the software is present and throws an error, then restarts because ConfigMgr still thinks it’s not there. This tended to happen with more complex Applications which use PKG installers rather than Applications which simply copy .app files. I do have a theory as to why this happens, although I only came up with it about two years later: when you repackage an application using CMAppUtil, it automatically generates the detection rules for you, and with PKG installers it puts a lot in there. I think that maybe it puts in too many, so the client ends up looking for a load of items it can’t detect despite the software being present. Unfortunately I haven’t managed to test the theory but it makes sense to me.

Another gotcha that I’ve found with the repackager is that it sometimes gets the installation command wrong, especially when you run it on a Mac with more than one operating system installed. It occasionally gets the installation path wrong too, necessitating a change to your installation command line.

DCM works nicely, but finding the PLIST file or the setting that you want to change via Bash can be troublesome. That said, it’s no worse than trawling through the registry or finding an obscure PowerShell command to do what you want on a Windows machine.

Rather mysteriously, Microsoft didn’t include a remote control agent with this. Considering that a VNC daemon is baked into all versions of OS X, this would have been trivial to implement.

The real bugbear that my team and I had with the Microsoft client is that Microsoft were very slow to implement support for new versions of OS X. As I’m sure you know, Apple have been on a yearly release cycle for major versions of OS X since they released Lion. Microsoft didn’t support Mountain Lion until six full months after Apple had released it on the App Store, the delay for Mavericks support wasn’t much better and Yosemite isn’t supported at all right now. It wouldn’t be so bad if it were a case of “Oh, it’s not supported but it’ll probably work”; unless there is explicit support for the OS X version in the client, it won’t work.

So in conclusion, the Microsoft client is better than nothing, but it’s not that good either. When my friend and colleague Robert wrote a brief piece about this subject on his blog, he got a message from the lovely people at Parallels telling him about a plugin they were writing for SCCM which also manages Macs. Stay tuned for Part Two of this article.

*Update*

Part two of this article is now up. If you want to see how this story ends, please click here
