DCM Script – Detect and disable Intel Graphics Card Service

As I imagine is the case for the majority of corporate PCs these days, all of the computers at my workplace have integrated Intel graphics chipsets. And why not? For a business PC they’re perfectly adequate; their 3D acceleration is good enough for Aero on Windows 7 and for anything else the vast majority of users need.

However, there is a rather… annoying feature of the drivers which I like to suppress. The driver puts an application into the System Notification Area which makes it easy for people to mess around with graphics settings and lets them change the orientation of the screen by pressing certain key combinations. I’m sure that for a lot of corporate settings this isn’t too much of a problem but for a school or college it generates a lot of helpdesk calls because the little sods (sorry, darlings) like hitting those keys and turning the screens upside down.

Anyway, this DCM script detects whether the service is running and, if it is, kills and disables it:

[powershell]$IntelGFXService = Get-Service | Where-Object {$_.Name -like 'igfx*'}

if ($IntelGFXService -ne $null) {

    $IntelGFXServiceName = $IntelGFXService.Name
    $IntelGFXStartupMode = Get-CimInstance Win32_Service -Filter "Name='$IntelGFXServiceName'"

    # Debug output - uncomment these when testing the script interactively
    # $IntelGFXService.Status
    # $IntelGFXStartupMode.StartMode

    if ($IntelGFXService.Status -eq "Running" -and $IntelGFXStartupMode.StartMode -eq "Auto")
    {
        echo "Service Started, Startmode Automatic"
    }
    elseif ($IntelGFXService.Status -eq "Stopped" -and $IntelGFXStartupMode.StartMode -eq "Auto")
    {
        echo "Service Stopped, Startmode Automatic"
    }
    elseif ($IntelGFXService.Status -eq "Running" -and $IntelGFXStartupMode.StartMode -eq "Disabled")
    {
        echo "Service Started, Startmode Disabled"
    }
    else
    {
        echo "all disabled"
    }

}
else
{
    echo "all disabled"
}[/powershell]

That checks the status of the service and reports the status back to ConfigMgr. The remediation script looks like this:

[powershell]$IntelGFXService = Get-Service | Where-Object {$_.Name -like 'igfx*'}

Set-Service -Name $IntelGFXService.Name -StartupType Disabled
Stop-Service -Name $IntelGFXService.Name
Get-Process igfx* | Stop-Process[/powershell]

That stops the service, disables it and kills any relevant processes running alongside the service.

Set the compliance rule to look for the string “all disabled” and apply the rule to either a new or an existing baseline. That’s it for today!

Mac Servers – What I’m Doing

Hopefully this should be the last post about Macs for the time being! I don’t know why I’ve written so many recently, I guess it’s just because the bloody things take up so much of my time at work. Don’t get me wrong, I like Macs a lot for my own purposes. I own two, my girlfriend has one, we both have iPhones, I have an Apple TV and an AirPort Extreme router. I even subscribed to MobileMe when you had to pay for it. It’s just that professionally, they’re a pain in the arse and if you really want I’ll put that on a certificate for you to put on your wall.

Anyway, my last post was originally supposed to be about the Apple server solution that we’re using at work. However, it kind of turned into a rant about how Apple has abandoned the server market. I stand by the post entirely but I thought I’d try to write the post I was originally intending to write!

A few months ago I got a call from the technician in the Media Studies department who looks after their Macs. As far as he could see, the Mac had stopped working entirely and he was getting into a bit of a panic about it (the Mac was actually fine; it had just buggered up its own disk permissions, preventing anything from launching). He was panicking because he thought that all of the student work stored on it had been lost. These Macs are used for editing video using Final Cut Pro; because of the demands that video editing puts on a computer’s storage subsystem, it is totally impractical to edit video over a network on one computer, let alone 40 at once, so the student has to store any work that he or she does on the Mac itself. If that Mac fails then the user kisses goodbye to all of the work done that year. That wasn’t acceptable but he and the college had put up with it up until that point. He wanted to know if there was a way to back them up so that if one does go kaboom, we have a way of recovering their work.

This turned into a fairly interesting project as I had to investigate viable solutions to achieve this. I came up with four:

  1. Backing the Macs up using an external hard drive and Time Machine
  2. Backing the Macs up using a server running OS X Server and Time Machine
  3. Backing the Macs up using a server from another manufacturer using some other piece of backup software
  4. Backing the Macs up to a cloud provider

First of all, I investigated and completely discounted the cloud solution. We would need in the order of 20TB of space and a huge amount of bandwidth from the cloud provider for the backups to work. The Macs would need to be left on at night as there would be no way we’d be able to back them up during the day and maintain normal internet operations. It all ended up costing too much money; if we had gone for a cloud solution we would have spent as much money in six months as a fairly meaty server would have cost.

There was also quite some thought put into using USB disks and Time Machine to back them up. This certainly would have worked, and it would have been nice and simple and relatively cheap. Unfortunately there were some big downsides too. Securing them would have been almost impossible; it would be far too easy for someone to disconnect one and walk off with it. If we wanted any decent kind of speed and capacity on them, we probably would have had to use enclosures with 3.5″ drives in them, which would have meant another power supply to plug in. Finally, I couldn’t see a way to stop students just using them as additional space to store their stuff on, completely negating the point of them in the first place.

So that left having some network storage for the Macs to store their backups on. First of all, I looked at Dell (Other server vendors are available) to see how much a suitable server would cost. Dell’s range of servers with lots of 3.5″ drive bays is frustratingly small but eventually I managed to spec a T620 with 2x1TB and 8x4TB drives, a quad core CPU, 16GB RAM, a half decent RAID controller and a three year warranty for about £7500 retail. Add on top the cost of a suitable backup daemon and it would cost somewhere in the region of £9000-£9500. Truth be told, that probably would have been my preferred option but a couple of things kept me from recommending it. First of all, £9500 is a lot of money! Secondly, although Apple have been deprecating AFP and promoting SMB v2/v3 with Mavericks and Yosemite, Apple’s implementations of SMB have not been the best since they started migrating away from Samba. AFP is generally still the fastest and most reliable network protocol that a Mac can use. With this in mind, I decided to have a look at what we could get from Apple.

The only Server in Apple’s range at the time was the Mac Mini Server. This had 8GB of RAM installed, a quad core i7 CPU at roughly 2.6GHz and 2x1TB hard disks. Obviously this would be pretty useless as a backup server on its own but that Thunderbolt port made for some interesting possibilities.

At first, I looked at a Sonnet enclosure which put the Mac into a 1U mount and had a Thunderbolt to PCI Express bridge. I thought that I could put a PCIe RAID controller in there, attach that to some kind of external SAS JBOD array and then daisy-chain a Thunderbolt 10GbE Ethernet adapter from that. I’m sure it would have worked but it would have been incredibly messy.

Nevertheless, it was going to be my suggested Apple solution until I saw an advert on a website somewhere which led me to HighPoint’s website. I remembered HighPoint from days of old when they sold dodgy software RAID controllers on Abit and Gigabyte motherboards. A Thunderbolt storage enclosure that they promote on their site intrigued me. It was a 24-disk enclosure with a Thunderbolt to PCIe bridge with three PCIe slots in it and, crucially, a bay in the back for a Mac Mini to be mounted. Perfect! One box, no daisy chaining, no mess.

Googling the part code led me to a local reseller called Span where I found out that what HighPoint were selling was in fact a rebadged NetStor box. This made me happy as I didn’t want to be dealing with esoteric HighPoint driver issues should I buy one. NetStor in turn recommended an Areca ARC-1882ix-24 RAID controller to drive the disks in the array and an ATTO FastFrame NS12 10GbE SFP+ network card to connect it to the core network. Both of these brands are reasonably well known and well supported on the Mac. We also put 8x4TB Western Digital Reds into the enclosure. I knew that the Reds probably weren’t the best solution for this but considering how much more per drive the Blues and the Blacks cost, we decided to take the risk. The cost of this bundle including a Mac Mini Server was quoted at less than £5000. Since OS X Server has a facility to allow Time Machine on client Macs to back up to it, there would have been no additional cost for a backup agent.

Considering how squeezed for money the public sector is at the moment, the Mac Mini Server plus Thunderbolt array was the chosen solution. We placed the order with Span for the storage and networking components and an order for a Mac Mini Server from our favourite Apple resellers, Toucan.

Putting it all together was trivial; it was just like assembling a PC. Screw the hard drives into their sledges, put the cards in the slots, put the Mac Mini into the mount at the back and that’s it. After I connected the array to the Mac, I had to install the kexts for the RAID controller and the NIC and in no time at all, we had a working backup server sitting on our network. In terms of CPU and memory performance it knocked the spots off our 2010 Xserve, which surprised me a bit. We created a RAID 6 array giving us 24TB of usable storage, or 22.35TiB if you want to use the modern terminology. The RAID array benchmarked at about 800MB/sec read and 680MB/sec write, which is more than adequate for backups. Network performance was also considerably better than using the on-board NIC. Not as good as having a 10GbE card in its own PCIe slot but you can’t have everything. The array is quite intelligent in that it powers on and off with the Mac Mini; you don’t have to remember to power the array up before the Mac like you have to with external SAS arrays.

I know that it’s not an ideal solution. There is a part of me which finds the whole idea of using a Mac Mini as a server rather abhorrent. At the same time, it was the best value solution and the hardware works very well. The only thing that I have reservations about is Time Machine. It seems to work OK-ish so far but I’m not sure how reliable it will be in the long term. However I’m going to see how it goes.

Mac Servers in a Post Xserve World

About three years ago, Apple discontinued the Xserve line of servers. This presented a problem. While the Xserve never was top tier hardware, it was at least designed to go into a server room; you could rack mount it, it had proper ILO and it had redundant power supplies. You would never run an Apache farm on the things but along with the Xserve RAID and similar units from Promise and Active, it made a pretty good storage server for your Macs and it was commonly used to act as an Open Directory Server and a Workgroup Manager server to manage them too.

Discontinuing it was a blow for the enterprise sector, who had come to rely on the things, as Apple didn’t really produce a suitable replacement. The only “servers” left in the line were the Mac Pro Server and the Mac Mini Server. The only real difference between the Server lines and their peers was that the Servers came with an additional hard drive and a copy of OS X Server preinstalled. The Mac Mini Server was underpowered, didn’t have redundant PSUs, only had one network interface, didn’t have ILO and couldn’t be racked without a third party adapter. The Mac Pro was a bit better in terms of spec; it at least had two network ports and in terms of hardware it was pretty much identical internally to its contemporary Xserve, so it could at least do everything an Xserve could do. However, it couldn’t be laid down in a cabinet as it was too tall, so Apple suggested you stood two side by side on a shelf. That effectively meant that you had to use 10U to house four CPU sockets and eight hard drives. Not a very efficient use of space, and the things still didn’t come with ILO or redundant power supplies and were hideously expensive, even more so than the Xserve. It also didn’t help that Apple didn’t update the Mac Pro for a very long time, so it was getting rapidly outclassed by contemporary hardware from other manufacturers, both in terms of specification and price.

Things improved somewhat when Thunderbolt-enabled Mac Mini Servers came onto the scene. They came with more RAM which could be expanded, an extra hard drive and another two CPU cores. Thunderbolt is essentially an externally presented pair of PCI Express lanes. It gives you a bi-directional interface providing 10Gbps of bandwidth to external peripherals. Companies like Sonnet and NetStor started manufacturing rack-mountable enclosures into which you could put one or more Mac Minis. A lot of them included Thunderbolt to PCI Express bridges with actual PCIe slots which meant you could connect RAID cards, additional network cards, faster network cards, fibre channel cards and all sorts of exciting serverish type things. It meant that, for a while, a Mac Mini Server attached to one of these could actually act as a semi-respectable server. They still didn’t have ILO or redundant PSUs but Mac servers could at least be reasonably easily expanded and their performance wasn’t too bad.

Of course, Apple being Apple, this state of affairs couldn’t continue. First of all they released the updated Mac Pro. On paper, it sounds wonderful; up to twelve CPU cores, up to 64GB RAM, fast solid state storage, fast GPUs, two NICs and six(!) Thunderbolt 2 ports. It makes an excellent workstation. Unfortunately it doesn’t make such a good server; it’s a cylinder which makes it even more of a challenge to rack. It only has one CPU socket, four memory slots, one storage device and there is no internal expansion. There is still no ILO or redundant power supply. The ultra powerful GPUs are no use for most server applications and it’s even more expensive than the old Mac Pro was. The Mac Pro Server got discontinued.

Apple then announced the long-awaited update for the Mac Mini. It was overdue by a year and much anticipated in some circles. When Apple finally announced it in their keynote speech, it sounded brilliant. They said it was going to come with an updated CPU, a PCI Express SSD and an additional Thunderbolt port. Sounds good! People’s enthusiasm for the line was somewhat dampened when they appeared on the store though. While the hybrid SSD/hard drive was still an option, Apple discontinued the option for two hard drives. They soldered the RAM to the logic board. The Mac Mini Server was killed off entirely, which means that you have to have a dual core CPU or nothing. It also means no memory expansion, no RAIDed boot drive and the amount of CPU resources available being cut in half. Not so good if you’re managing a lot of iPads and Macs using Profile Manager or if you have a busy file server. On the plus side, they did put in an extra Thunderbolt port and upgraded to Thunderbolt 2, which would help if you were using more external peripherals.

Despite all of this, Apple still continue to maintain and develop OS X Server. It got a visual overhaul similar to Yosemite and it even got one or two new features, so it clearly matters to somebody at Apple. So bearing this in mind, I really don’t understand why Apple have discontinued the Mac Mini Server. Fair enough them getting rid of the Mac Pro Server: the new hardware isn’t suitable for the server room under any guise and it’s too expensive. You wouldn’t want to put an iMac or a MacBook into a server room either. But considering what you’d want to use OS X Server for (Profile Manager, NetRestore, Open Directory, Xcode), the current Mac Mini is really too underpowered and unexpandable. OS X Server needs some complementary hardware to go with it and there isn’t any now. There is literally no Apple product being sold at this point that I’d want to put into a server room and that’s a real shame.

At this point, I hope that Apple do one of two things. Either:

Reintroduce a quad or hex core Mac Mini with expandable memory available in the next Mac Mini refresh

Or

Start selling a version of OS X Server which can be installed on hypervisors running on hardware from other manufacturers. OS X can already be run on VMware ESXi; the only restriction that stops people doing this today is licensing. This would solve so many problems: people would be able to run OS X on server-class hardware with whatever they want attached to it again. It wouldn’t cause any additional work for Apple as VMware and others already have support for OS X in their consumer and enterprise products. And it’d make Apple even more money. Not much perhaps, but some.

So Tim Cook, if you’re reading this (unlikely I know), give your licensing people a slap and tell them to get on it. kthxbye

Ian’s Pasta Sauce Recipe

Right, I’m going to try to write about something other than bloody computers for once and share a recipe for a pasta sauce that I think I’m quite good at. I think that it’s classed as a ragout; I’m hesitant to call it a Bolognese as it has whole tomatoes in it rather than tomato puree and beef stock, and if I did, I’d probably offend the whole city of Bologna and have a Nonna from there chase me down the road with a pickaxe. Anyway, it’s a good recipe and it generally goes down very well with my family when I do it. It’s a lot better than buying a jar of Dolmio or similar; it tastes better and you have a better idea of what’s actually in it!

Serves 6-8. It freezes well so if you don’t have 6-8 people to give this to, just put the leftover sauce in the freezer.

Ingredients

  • 500g Minced beef, 15-20% fat content
  • 2 tins of chopped tomatoes
  • 1 red bell pepper, coarsely chopped
  • 1 large chilli pepper, finely chopped (Optional)
  • 1 large red onion, finely chopped
  • 1 large carrot, grated
  • 4 cloves of garlic, crushed or finely sliced
  • 1 tbsp tomato puree
  • 50g hard garlic sausage such as salami, finely chopped. If you’re in the UK, Pepperami is a good one to use
  • 100ml red wine
  • Some herbs such as basil and oregano. Fresh preferably but dried will do too.
  • Salt and pepper
  • Mild olive oil

Method

Put a large pan on a medium to high heat. Heat around 2 tbsp of the olive oil until it’s very hot. Add the onion, pepper, chilli if you’re using it and the garlic sausage and fry it all for around five minutes or until the onion is soft and translucent, stirring occasionally. Add the garlic and fry for another minute. Make sure that you stir the mixture constantly so that the garlic doesn’t burn. Add the tomato puree and fry for another couple of minutes.

Add the beef mince and fry it until it’s browned. Add the grated carrot and cook for a minute. Add the red wine and let it simmer for about two minutes. Finally, add the chopped tomatoes and give the whole thing a good mix. Add a splash of water if you think it’s necessary. Bring the mixture to the boil then reduce the heat. Simmer for about 90 minutes to two hours or until most of the liquid has boiled off and the mixture thickens. Add the herbs and season the mixture with the salt and pepper towards the end.

Serve with your favourite pasta (I suggest tagliatelle) or with gnocchi. It’s quite good with an Italian hard cheese such as Grana Padano or Pecorino Romano grated over it.

I’m aware that this recipe is about as Italian as I am but I’ll just end with “Buon appetito!” anyway. Enjoy!

Variation

Sometimes I swap the beef mince for sausages squeezed out of their skins. It’s very good with some of the flavoured sausages that you get in the more upmarket supermarket ranges such as Taste the Difference or Finest (Other supermarket brands are available).

You can use this sauce for lasagnes and for pastitsio. To make a pastitsio, make up a quantity of the sauce and some cheese sauce. Cook some tubular pasta such as macaroni, penne or rigatoni. Mix the pasta in with the meat sauce and put the mixture into an ovenproof dish. Top it with the cheese sauce and bake for 30-40 minutes or until the top is browned. Take it out of the oven and let it stand for five minutes then serve.

Managing Macs using System Center Configuration Manager – Part Two

In my previous article, I described the agent that Microsoft have put into System Center Configuration Manager to manage Macs with. Overall, while I was happy to have some kind of management facility for our Macs, I found it to be somewhat inadequate for our needs and I wished it was better. I also mentioned that Parallels, the company behind the famous Parallels Desktop virtualisation package for the Mac, contacted us and invited us to try out their ConfigMgr plugin. This article will explain what the Parallels agent is capable of, how well it works and how stable it has proven to be since we’ve had it installed.

Parallels Mac Management Agent for ConfigMgr

The Parallels agent (PMA) is an ISV proxy for ConfigMgr. It acts as a bridge between your Macs and the management point in your ConfigMgr infrastructure. The agent doesn’t need HTTPS to be enabled on your ConfigMgr infrastructure and ConfigMgr sees Macs as full clients. The Parallels agent fills in a lot of the gaps left by the native Microsoft agent, such as:

  1. A graphical and scriptable installer for the agent
  2. A Software Center-like application which lists available and installed software. Users can also elect to install published software if desired.
  3. Support for optional and required software installations
  4. Operating System Deployment
  5. The ability to deploy .mobileconfig files through DCM
  6. A VNC client launchable through the ConfigMgr console so you can control your Macs remotely
  7. Auto-enrollment of Macs in your enterprise
  8. Support for FileVault and storage of FileVault keys

It supports almost everything else that the native ConfigMgr client does and it doesn’t require HTTPS to be turned on across your infrastructure. In addition, if you use Parallels Desktop for Mac Enterprise Edition, you can use the plugin to manage VMs.

Installation

The PMA requires a Windows server to run on. In theory, you can have the PMA installed on the server hosting your MP or it can live on a separate server. There are separate versions of the PMA available for ConfigMgr 2007 and 2012/2012 SP1/2012 R2.

Earlier versions of the PMA didn’t support HTTPS infrastructures properly so you needed to have at least one MP and one DP running in HTTP mode. The latest version, however, supports HTTPS across the board, although you do need to have at least one DP running in anonymous mode for the Macs to download from.

IIS needs to be installed on the server along with WebDAV and BITS. Full and concise instructions are included so I won’t go over the process here. Anybody who has installed a ConfigMgr infrastructure will find it easy enough.

If you are using the OSD component, it needs to be installed on a server with a PXE enabled DP. If you have multiple subnets and/or VLANs, you will need an IP helper pointing at the server for the Macs to find it.

Software Deployment

The PMA supports two methods of deploying software. You can use either Packages or Applications.

Generally speaking, there are three ways to install a piece of software on a Mac, not counting the App Store:

  1. You can have an archive file (usually a DMG) with a .app bundle in it to be copied to your /Applications or ~/Applications folder
  2. You can have an archive file with a PKG or MPKG installer to install your application
  3. You can install directly from a standalone PKG or MPKG.

Installing using Packages

Unlike with the Microsoft agent, you don’t need to repackage your software to deploy it with the PMA. To avoid doing so, you can deploy it using legacy-style Packages. To deploy a piece of software using ConfigMgr Packages, you create a Package in the same way as you would for Windows software: copy the software to a Windows file share, then create a Program inside the Package with a command line to install it. Using the three scenarios above, the command lines would look like this:

  1. :Firefox 19.0.2.dmg/Firefox.app:/Applications:
  2. :iTunes11.0.2.dmg/Install iTunes.pkg::
  3. install.pkg

The first command tells the PMA to open the DMG and copy the Firefox.app bundle inside it to the /Applications folder. The second tells the PMA to open the DMG and execute the .pkg file with a switch to install it silently. The third just runs an arbitrary command.

Once the Package and the Program inside it have been created, you distribute the Package to a DP and deploy it to a collection as standard.
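If you’d rather script that side of things than click through the console, the standard ConfigMgr cmdlets can create the Package and Program for you. This is only a rough sketch based on the first scenario above; it assumes the ConfigMgr console (and therefore its PowerShell module) is installed, and the site code, share path and DP name are placeholders you’d swap for your own:

[powershell]
# Load the ConfigMgr module and switch to the site drive (P01 is a placeholder site code)
Import-Module "$($env:SMS_ADMIN_UI_PATH)\..\ConfigurationManager.psd1"
Set-Location P01:

# Create a Package pointing at the share holding the Firefox DMG
$Package = New-CMPackage -Name "Firefox 19 for Mac" -Path "\\fileserver\MacSoftware\Firefox"

# Create a Program inside it using the PMA-style command line from scenario 1 above
New-CMProgram -PackageName $Package.Name -StandardProgramName "Install Firefox" `
    -CommandLine ':Firefox 19.0.2.dmg/Firefox.app:/Applications:'

# Send the content to a DP; deploying the Program to a collection is then done as normal
Start-CMContentDistribution -PackageName $Package.Name -DistributionPointName "dp01.domain.com"
[/powershell]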

Deploying software using this method is functional and it works nicely. The disadvantage is that there is no way to tell if a deployment has worked without either delving through the logs or looking in the Applications folder. If you deploy the Package to a collection, the PMA will try to install the Package on the Mac whether it’s already on there or not.

Installing using Applications

As of version 3.0 of the PMA, Parallels have started supporting ConfigMgr Applications as well as Packages. Parallels are using Microsoft’s cmmac file format to achieve this. This means that you need to repackage applications and add them to the ConfigMgr console using the same method as you do for the native ConfigMgr client. This is a bit of a pain but the benefits that doing so brings make it worthwhile. As with the Microsoft client, there are detection rules built into the Application meaning that the Parallels client can check to see if a piece of software is deployed on the Mac before it attempts to deploy it. If it is already there, it gets marked as installed and skips the installation.

It also brings this to the table:

[Screenshot: the Parallels Application Portal]

That is the Parallels Application Portal. Much like Software Center on Windows, this gives you a list of software that has been allocated to the Mac. It allows for optional and required installations. If a deployment is marked as optional, the software is listed with a nice Install button next to it.

As with the Microsoft agent, you need to be careful with the detection rules that you put into your Applications. The PMA runs a scan of the /Applications and /Library folders looking for Info.plist files. It extracts the application name and version from those PLISTs and adds them to an index. It then looks at the detection rules in the Application and compares them to the index that it builds. If there’s a match, it marks the application as installed. If you don’t get the detection rules right, the PMA won’t detect the application even if it has been installed properly and then it eventually tries to reinstall it. I spent a very interesting afternoon chasing that one down. There are also some applications which don’t have Info.plist files or get installed in strange places. The latest Java update doesn’t have an Info.plist; it has an alias to another PLIST file called info.plist instead. The PMA didn’t pick that one up.

Operating System Deployment

Quite impressively, Parallels have managed to get an operating system deployment facility built into the PMA. It’s basic but it works.

First of all, you need to create a base image for your Macs and capture it using the System Image Utility found in OS X at /System/Library/CoreServices/. You can create custom workflows in this utility to help you blank the hard disk before deployment and to automate the process. Make sure you tell it to make a NetRestore image, not a NetBoot image like I first did. Once you’ve done that, you tell it where you want to save your image and set it on its way. The end result is an NBI file, which is a bundle containing your system image and a bootstrap.

You then copy the resulting NBI file onto a PC or server with the ConfigMgr console and Parallels console addon installed. Once it’s on there, open the console and go to the Software Library workspace. Go to Operating Systems, right click on Operating System Images and choose Add Mac OS X Operating System Image. A wizard appears which asks you to point at the NBI file you generated and then to a network share where it creates a WIM file for ConfigMgr.

[Screenshot: the Add Mac OS X Operating System Image wizard]

Once the WIM has been generated, it adds itself to the console but one quirk I have noticed is that if you try to create it in a folder, it doesn’t go in there. It gets placed in the root instead. You can move it manually afterwards so it’s not a huge issue.

The next step is to create a task sequence. There is a custom wizard for this too; you need to right-click on Task Sequences under Operating System Deployment in the Software Library workspace then choose Create Task Sequence for Macs.

[Screenshot: the Create Task Sequence for Macs wizard]

You give the task sequence a name, choose the image that you want to use and press the Finish button. Again, if you’re using folders to organise your task sequences and you try to create the task sequence in a folder, it will get placed in the root of the structure rather than in the folder that you tried to create it in. You can move it if you wish.

From there, it’s pretty much standard ConfigMgr. You need to distribute the image to a DP and publish the task sequence to a collection. The task sequence then appears to the Macs in that collection as a standard Netboot image with the title that you gave to it. You can access it the usual way, either through the Startup Disk pane in System Preferences or by holding down the option key on startup.

[Screenshot: the deployed task sequence showing up as a boot disk on the Mac]

Unfortunately, what it doesn’t do is allow for any kind of automatic post-image deployment sequence. Although in theory the System Image Utility is supposed to support custom workflows which allow software installations and the running of scripts, I haven’t managed to get it to deploy the agent automatically. I have therefore created a script which the admin deploying the Mac needs to run which (amongst other things) names the Mac and installs the PMA. From what Parallels tell me, this is being worked on.

DCM – Scripts and Profiles

The PMA supports the usage of DCM Bash scripts in the same way as the Microsoft agent does. There isn’t much to say about this: it works and it’s a useful thing to have. The other way of getting settings onto Macs with the PMA is via mobileconfig files generated either by Profile Manager in OS X Server or by a generator built into the ConfigMgr console add-in. The generator looks like this:

[Screenshot: the configuration profile generator]

Look familiar? Unfortunately it doesn’t have all of the options that Profile Manager does, so if you want to do something more advanced than what’s on offer here, you still need a copy of OS X Server and Profile Manager to generate the profile.

To deploy a mobileconfig file using the PMA, you need to go to the Assets and Compliance workspace, go to Compliance Settings and right click on Configuration Items. Go to Create Parallels Configuration Item then to Mac OS X Configuration Profile.

[Screenshot: the Create Parallels Configuration Item menu]

You then give the configuration item a name, decide whether it’s a User or System profile and point the wizard at the mobileconfig file generated by Profile Manager. Once you’ve done that, there is a new configuration item in your console which can be added to a DCM baseline and deployed to a collection.

I have used this facility for various purposes: for configuring AirPort, for joining the Mac to the AD domain, for setting up the user’s desktop settings and wallpaper, for setting up Time Machine and for more besides. It’s a great facility to have and rather more user-friendly than delving through PLISTs to find settings.

Other features – VNC Client, Auto-enrolment and Inventory

Well, these do what they say on the tin. We have a VNC client:

[Screenshot: the VNC client]

It takes an inventory:

[Screenshot: hardware inventory taken from a Mac]

It has the ability to detect Macs on your network and automatically install the agent on them. They all work. There isn’t really much else to be said.

How well does it work?

So clearly, the PMA has far more features than the Microsoft agent does, but a piece of software can have all the features in the world and still be useless if it isn’t stable. In this regard, I am delighted to say that the Parallels agent has been rock solid for us. It has been continually improved and has had feature after feature added. It doesn’t quite make a Mac a first class citizen on a Windows network but it comes very close and, going by the way that Parallels have improved the product over the last two years, I’m confident that the gap will close. Parallels have been a lot quicker at getting new versions of OS X supported with their plugin too; it already has support for Yosemite.

It hasn’t been entirely problem free but when I’ve contacted Parallels Support, they’ve been quick and efficient and they’ve got the problem resolved. Most problems that I’ve come across I’ve managed to solve myself with a bit of thought.

Although Parallels claim that the PMA can be installed on the same server as your management point, I couldn’t get it to work when I did this. I ended up putting it on its own hardware. This was with v1.0 of the product though; we’re now on v3.1 so they may have improved that since then.

Having the PMA has also meant that I no longer need a Magic Triangle set up to support and configure my OS X clients. I don’t need Profile Manager or Workgroup Manager to deploy settings and I don’t need OS X Server or DeployStudio to deploy images. The only things I need OS X Server for are Profile Manager to generate the profiles and (with the arrival of Yosemite) the System Image Utility.

The only real downside to the PMA is that it’s expensive and that you have to pay full price for it annually. That may be hard to justify if you’ve already spent a fortune on a System Center license.

Conclusion

So let’s do a quick advantage/disadvantage comparison:

Microsoft client advantages:

  • Native client so no additional cost
  • Support from Microsoft

Microsoft client disadvantages:

  • Sparse feature set
  • Required installs only
  • Complicated DCM
  • Takes a long time to install multiple applications
  • Requires HTTPS
  • Slow to support new versions of OS X
  • No visible status indicators, next to impossible to see what’s going on.

Parallels client advantages

  • Includes lots of additional features, bringing the level of management of Macs to near parity with Windows machines
  • Optional and user initiated installs supported
  • Software Center equivalent and a System Preferences pane to check status of agent. Very thorough logs to check on what the agent is doing are available too.
  • OSD
  • Doesn’t require HTTPS
  • Supports SCCM 2007
  • Much easier to deploy settings by using mobileconfig files

Parallels client disadvantages

  • Expensive
  • Requires an additional DP
  • Probably requires an additional server to install the proxy

They’re as good as each other when it comes to running DCM scripts and taking inventories. So the answer is simple: If you can afford it, get the Parallels Management Agent. If I were the head of the System Center division at Microsoft, I’d be going cap in hand to Satya Nadella and telling him to drive all the money to Parallels HQ to acquire their product from them. Frankly, it puts Microsoft’s own efforts to shame.

Automating SCVMM Patching

Patching. What a pain in the backside it is. It’s bad enough when you just have to look after one or two machines at home. When you have thousands of the things to look after, it can be horrific.

A recent brace of high-level, out-of-band Microsoft security updates has forced me to take a close look at our patching infrastructure at work and make some improvements. One of those improvements was to look at the patching of our Hyper-V hosts. I’m a little ashamed to admit it but I haven’t really looked at patching our virtual hosts since they were installed back in August 2013. This meant that when I went to install those updates, there were another 130 to install with them. Not so good.

We are using System Center Virtual Machine Manager to manage our virtualisation farms. Part of its functionality is keeping your hosts patched. Doing so through the UI is a pretty long-winded process: you have to add a WSUS server to the SCVMM infrastructure, synchronise SCVMM with WSUS, add any new updates to a baseline, scan the hosts against that baseline to see if they’re compliant and remediate them if they’re not. It’s simple enough but it’s tedious, and unfortunately there is no way to automate this process through the GUI.

However, thanks to PowerShell it can be scripted! I am using this script to mostly automate the process for me:

[powershell]

Import-Module virtualmachinemanager
Import-Module virtualmachinemanagercore

$anonUsername = "anonymous"
$anonPassword = ConvertTo-SecureString -String "anonymous" -AsPlainText -Force
$anonCredentials = New-Object System.Management.Automation.PSCredential($anonUsername,$anonPassword)
$PSEmailServer = "smtp.server.domain"

Get-VMMServer -ComputerName scvmm.domain.com
Start-SCUpdateServerSynchronization -UpdateServer wsus.server.domain

$2012Updates = $(Get-SCUpdate | Where-Object -FilterScript { `
$_.Products -like "Windows Server 2012" -and `
$_.IsSuperseded -eq $false -and `
$_.CreationDate -gt $((Get-Date).AddMonths(-1)) `
})

$Baseline = New-SCBaseline -Name "$(Get-Date -format y) Updates"
$AddedUpdateList = @()
$2012Updates | foreach {
$AddedUpdateList += Get-SCUpdate -ID $_.ID
}

$scope = Get-SCVMHostGroup -Name "Hyper-V" -ID "8db6c432-7326-429d-af6d-8c93d201ca9f"
Set-SCBaseline -Baseline $baseline -AddAssignmentScope $scope -JobGroup "a0bcf812-b866-474b-a69a-13db8f8ec360" -RunAsynchronously
Set-SCBaseline -Baseline $baseline -RunAsynchronously -AddUpdates $addedUpdateList -JobGroup "a0bcf812-b866-474b-a69a-13db8f8ec360" -StartNow

Start-Sleep -Seconds 10

Get-SCVMHostCluster -Name "cluster1.domain.com" | Start-SCComplianceScan
Get-SCVMHostCluster -Name "cluster2.domain.com" | Start-SCComplianceScan

Send-MailMessage -to "helpdesk@domain.com" `
-from "scvmm@domain.com" `
-subject "Time to patch the Hyper-V Farms" `
-Body "Dear IT Support,

Virtual Machine Manager has downloaded the $(Get-Date -format y) updates for the Windows hosts in the Hyper-V clusters. Please ask a member of the team to check the compliance status of the hosts in SCVMM and remediate any problems that are found.

TTFN,

SCVMM"`
-priority High `
-credential $anonCredentials

[/powershell]

That script connects to SCVMM, looks for any updates released for Windows Server 2012 in the last month and adds them to a new baseline named after the month. It assigns the baseline to a host group called Hyper-V and starts a compliance scan on two clusters. Finally, it emails the helpdesk to let them know that there are new updates to be installed. That’s what the anonymous credentials are in there for: the account that I use to run the script doesn’t have a mailbox, so our Exchange server rejects the message when it tries to authenticate using that account.

If you’re feeling particularly brave, you can add this to the script:

[powershell]

Get-SCVMHostCluster -Name ClusterName | Start-SCUpdateRemediation -RemediateAllClusterNodes

[/powershell]

That will, in theory, cycle through each of the hosts in your cluster, putting each one into maintenance mode, migrating the VMs on that host onto another in the cluster, installing the updates and rebooting it. I say “in theory” because I haven’t managed to get this to work 100% reliably yet.

I then created a scheduled task which runs this script on the second Wednesday of every month.
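For reference, this is roughly how that kind of trigger can be created from an elevated prompt. Treat it as a sketch: the task name, script path, start time and service account below are all placeholders that you’d change to suit your environment.

[powershell]
# Create a task that fires on the second Wednesday of every month at 07:00
# (schtasks will prompt for the service account's password because of the bare /RP)
schtasks.exe /Create /TN "SCVMM Monthly Update Baseline" `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\New-SCVMMBaseline.ps1" `
    /SC MONTHLY /MO SECOND /D WED /ST 07:00 /RU DOMAIN\svc-scvmm /RP
[/powershell]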

Anyway, there you have it. Feel free to steal this if you want it but run it at your own risk, I’m not responsible if it does something unfortunate.

DCM Script – Disable On-Board Sound Card if USB Device is Attached

The first script in my new library is one that I am quite proud of as it was the first that I created to solve a relatively complex problem. It came to be because of the Music department at the college that I work at. The PCs in their department have external USB sound cards for students to plug MIDI instruments into and other such things (Hell, I’m not a musician, I don’t understand what they’re for exactly!). The problem was that Sonar, the music sequencing software that they use, was giving them trouble when both the on-board audio and the USB device were enabled. They therefore wanted me to disable the on-board sound card in the machines so that it wouldn’t happen again.

I could have gone to each of their PCs and just disabled the on-board sound in the BIOS or in Windows but that would be a short-term fix; if the PC got replaced or rebuilt, the person doing that would have to remember to disable it again. I therefore wrote this:

[powershell]

$SoundDevices = Get-CimInstance Win32_SoundDevice

if ($SoundDevices.DeviceID -like "USB*")
{
    # USB sound card detected, so check whether the on-board HDAUDIO device is still active
    $HDAudio = Get-CimInstance Win32_SoundDevice -Filter 'DeviceID LIKE "HDAUDIO%"'
    $AudioStatus = $HDAudio.StatusInfo
    if ($AudioStatus -eq '3')
    {
        # On-board still active, needs to be disabled
        echo "USB Audio detected, on-board audio needs to be disabled"
    }
    else
    {
        # USB detected, on-board disabled
        echo "OK"
    }
}
else
{
    # No USB, on-board sound to be left alone
    echo "OK"
}
[/powershell]

 

This script queries WMI to find out what sound devices are installed in the machine. If it detects one with USB in the device string, it goes on to see if the on-board HDAUDIO device is enabled. If it is, it sends a string back to ConfigMgr saying that remediation needs to happen. If the on-board device is disabled, it sends “OK” back to ConfigMgr. If there is no USB audio device installed at all, it also sends “OK” back to ConfigMgr.

The remediation script is quite simple. Since these are Windows 7 machines, there is no built-in PowerShell way of managing hardware, so I had to use the old DEVCON command. The remediation script looks like this:

[code]%Path_to_file%\devcon.exe disable HDAUDIO\*[/code]

That disables any device with HDAUDIO at the beginning of its device ID.
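As an aside, more recent versions of Windows ship with a PnpDevice PowerShell module that can do the same job without DEVCON. Something along these lines should work, although our machines are on Windows 7 so I haven’t used it in anger; treat it as an untested sketch:

[powershell]
# Find any sound device whose instance ID starts with HDAUDIO and disable it
$OnboardAudio = Get-PnpDevice -Class Media | Where-Object { $_.InstanceId -like 'HDAUDIO*' }
foreach ($Device in $OnboardAudio) {
    Disable-PnpDevice -InstanceId $Device.InstanceId -Confirm:$false
}
[/powershell]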

Set the compliance rule on the DCM script to look for a string that says “OK” and that’s it. Then add the rule to an existing baseline or create a new one and deploy it to a collection.

Managing Macs using System Center Configuration Manager – Part One

This post is about one of my favourite subjects, namely Configuration Manager, referred to hereafter as ConfigMgr. If you don’t care about the intricacies of desktop management, I suggest you look away now cos this ain’t gonna interest you!

Before I get too far into this post, I should mention that I’ve written about this subject before on my blog at EduGeek. The content here therefore isn’t that new but I’ve rewritten some of it and hopefully improved on it a little too. I will also say that all of this was tried almost two years ago now so chances are that things have changed a little with ConfigMgr 2012 R2. From what I understand though, most of what I’ve written here is still accurate.

Anyway, I spend a lot of time at work using ConfigMgr to manage the computers on our Windows network. We use it to almost its full extent; we use it for software and update deployment, operating system deployment, for auditing software usage, for configuration of workstations using DCM and a fair bit more besides.

As well as having more than 1500 Windows PCs, laptops and servers, I also have around 80 Macs to manage as well. To put it mildly, they were a nuisance. They were essentially unmanaged; whenever an update or a piece of software came along, we had to go to each individual Mac and install it by hand. The remote administration tools that we were using (Apple Remote Desktop and Profile Manager) were woefully inadequate. ARD requires far too much interaction and hasn’t had any significant update since Leopard was released. Profile Manager does an OK job of pushing down settings but for software management, it assumes that the Macs are personal devices getting all of their software from the App Store. That’s not really good enough. We were desperate to find something better.

We had been using ConfigMgr to manage our Windows PCs for a couple of years by that point and we had recently upgraded to 2012 SP1 which featured Mac management for the first time. We figured that we may as well give it a go and see what it was like. This is what I found out.

First of all, ConfigMgr treats Mac clients as mobile devices, which means that you have to set up an HTTPS infrastructure and install an enrolment point for your Macs to talk to. Your management point needs to talk HTTPS, as do your distribution points. That also means that you need to allocate certificates to your PXE points and task sequence boot media if you want them to talk to the rest of your infrastructure.

Once you have all of this set up, you need to enrol your Macs. Bear in mind that I looked at this when ConfigMgr 2012 SP1 was the current version. I understand that the process has changed a little in 2012 R2.

First of all, you need to download the Mac Management Tools from here for 2012 SP1 and here for 2012 R2. This gets you an MSI file which you need to install on your Windows PC. That MSI file contains a DMG file which you need to copy to your Mac. In turn, that DMG file contains the installer for the Mac client, the utility for enrolling your Macs in ConfigMgr and an application repackager. You have to install the client first of all from an elevated Terminal session. Once that’s installed, you need to run another command to enrol your Mac into ConfigMgr. Assuming that you get this right, it will download a certificate and you’re good to go. When I was setting up the Macs to use this, I found a very good blog post by James Bannan which goes into a lot more detail.

Once your Mac has been enrolled, you will want to start doing something useful with it. At the moment, the Microsoft client has the following abilities:

  1. You can deploy software
  2. You can install operating system updates using the software deployment mechanism
  3. You can check and change settings by using DCM to modify PLIST files
  4. You can check and change settings by using DCM and Bash scripts to return values and make changes
  5. The agent takes an inventory of the software and hardware installed on your Mac and uploads it to your management point.

Deployment of Software and Updates

Deploying software on the Macs is broadly similar to the same process on Windows computers; you need to add the software to ConfigMgr as an Application, create a deployment type with some detection rules, distribute the software to a DP and deploy the Application to a collection. The one difference is that you need to repackage the software into a format that ConfigMgr understands. It has a specific format for Mac software called “cmmac”. This is essentially a repackaged ZIP file containing either a .app, a .pkg or a .mpkg along with an XML file which has an inventory of the ZIP, installation instructions and some predefined detection rules. I don’t want to make this already long post any longer than it needs to be so I’ll link to Mr. Bannan’s blog again, which has a very good rundown of the entire process.
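As an aside, if you’d rather not click through the console for every application, newer versions of the ConfigMgr PowerShell module can add the repackaged .cmmac for you. The cmdlets have moved around a bit between releases, so take this as a rough sketch rather than gospel; the application name, share path, DP and site code are placeholders:

[powershell]
# Load the ConfigMgr module and switch to the site drive (P01 is a placeholder site code)
Import-Module "$($env:SMS_ADMIN_UI_PATH)\..\ConfigurationManager.psd1"
Set-Location P01:

# Create the Application and give it a Mac deployment type pointing at the repackaged file
New-CMApplication -Name "Firefox for Mac"
Add-CMMacDeploymentType -ApplicationName "Firefox for Mac" -DeploymentTypeName "Firefox cmmac" `
    -ContentLocation "\\fileserver\MacSoftware\Firefox 19.0.2.cmmac"

# Distribute the content; deploying the Application to a collection works in the usual way
Start-CMContentDistribution -ApplicationName "Firefox for Mac" -DistributionPointName "dp01.domain.com"
[/powershell]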

Changing settings using PLIST files

This isn’t the simplest of processes but it is quite effective. The first step is to open the ConfigMgr console on your Windows PC and go to the Assets and Compliance workspace. From there, go to Overview then Compliance Settings then Configuration Items. Right click and click Create Configuration Item. This will bring up the following window:

[Screenshot: the Create Configuration Item wizard]

This example is going to set a proxy server against a network interface so I have named it appropriately and given it a description. Make sure that you set the Configuration Item Type to Mac OS X. Press the Next button.

[Screenshot: the operating system selection page]

The next box lets you target your setting to specific versions of OS X. This screenshot was taken nearly two years ago when Microsoft hadn’t got around to adding Mountain Lion support. The current version supports up to and including Mavericks but not Yosemite (yet). Choose a specific version or not depending on what you need and press Next.

[Screenshot: the Create Setting page]

You then need to tell ConfigMgr what PLIST file you’re editing and which key you want to change. You also need to tell it whether the key is a string, a number, a boolean value etc. Once you’ve done that, change to the Compliance Rules tab.

[Screenshot: the compliance rule editor]

You need to add a rule for each setting that you’re changing. The one in the example above is setting the network name of the HTTP proxy server for the Ethernet interface on the Mac. To complete this example, you’d also need to set one for the HTTPS proxy, the port number and any proxy exceptions. Make sure that the Remediate checkbox is ticked on any rules that you create and finish the wizard.

Once your compliance rule is completed, you will need to create a DCM baseline or add it to an existing baseline and deploy that baseline to a collection. I’m not going to go through the process here as it’s largely identical to doing it for a Windows PC.

Changing settings using Bash Scripts

This is probably the more powerful way of using DCM as you’re not relying purely on PLIST files to make your changes. If you can detect and remediate the setting that you want to change using Bash, you can use a script here. This could be a setting in an XML file, a config file somewhere, a PLIST etc. I’m sure you get the idea. The process for creating a compliance rule using a script is largely similar to creating one for a PLIST and even more similar to creating one for a Windows machine. When you get to the third window, choose Bash Script as the setting type instead of Preference File. You get the opportunity to input two scripts: one to detect the setting and one to change it.

System Inventory

Again, this works in the same manner as it does for Windows machines, albeit not in quite as much detail. At the very least, you get a list of the hardware and software installed on the machine and the agent keeps track of any changes made. Asset Intelligence and Software Metering aren’t supported, however.

What can’t it do?

  1. OSD
  2. Remote Control
  3. Asset Intelligence
  4. Antivirus monitoring (Although it will deploy SCEP for Mac happily enough)
  5. Software Metering
  6. Power Management (Not easily anyway)

Results

So I’ve covered how it all works. The question that you may be asking now is “How well does it work?”. The answer two years ago was “It works OK… ish. Maybe”. I shall try to explain.

The whole thing feels very much like a v0.1 beta rather than proper release software. It’s functional up to a point but there are some very rough edges and the functionality is nowhere near as strong on the Mac (and presumably Linux too) as it is on a Windows PC.

For starters, you can only deploy applications to machines and not to users. You can’t have optional installs. There is no Software Center so you can’t easily see what software has been deployed and what software is supposed to be deployed. When the agent detects a deployment, it comes up with a sixty minute countdown, the length of which can’t (or couldn’t) be changed. You can tell the Mac to start deployment when you see the countdown but if you’re deploying (say) six pieces of software and you leave the Macs unattended, the countdown comes up, expires, installs the software then the next countdown comes up, expires, installs the software and so on. It can take hours for multiple deployments to finish if you’re not paying attention.

I also found that the detection of deployments was rather erratic too. Just like with Applications for Windows PCs, there are detection rules which ConfigMgr uses to determine whether a piece of software is installed on the Mac or not. The ConfigMgr client is supposed to use the detection rules to detect whether the Application is installed and skip installation of deployed applications if it detects that it’s already present. Unfortunately the detection process seemed rather erratic and our Macs had a habit of repeatedly trying to install software that was already there. The process then fails because the installer detects that the software is there already and throws an error. The process then restarts because ConfigMgr still thinks it’s not there. This tended to happen with more complex Applications which use PKG installers rather than Applications which copy .app files. I do have a theory as to why this happens, although it only occurred to me about two years later. When you repackage the application using CMAppUtil, it automatically generates the detection rules for you. With PKG installers, it puts a lot in there. I think that maybe it puts too many in there, so it’s looking for a load of settings it can’t detect despite the software being present. Unfortunately I haven’t managed to test the theory but it makes sense to me.

Another gotcha that I’ve found with the repackager is that sometimes, it gets the installation command wrong, especially when you run it on a Mac with more than one operating system installed on it. It sometimes gets the path to install to wrong, necessitating a change in your installation command line.

DCM works nicely but finding the PLIST files or the setting that you want to change via Bash can be troublesome. That said, it’s no worse than trawling through the registry or finding an obscure PowerShell command to do what you want on a Windows machine.

Rather mysteriously, Microsoft didn’t include a remote control agent with this. Considering that a VNC daemon is baked into all versions of OS X, this would have been trivial to implement.

The real bugbear that my team and I had with the Microsoft client is that Microsoft were very slow to implement support for new versions of OS X. As I’m sure you know, Apple have been working on a yearly release model for major versions of OS X since they released Lion. Microsoft didn’t support Mountain Lion for six full months after Apple had released it on the App Store. The delay for Mavericks support wasn’t much better and Yosemite isn’t supported at all right now. It wouldn’t be so bad if it were a case of “Oh, it’s not supported but it’ll probably work”. Unless there is explicit support for the OS X version in the client, it won’t.

So in conclusion, the Microsoft client is better than nothing but it’s not that good either. When my friend and colleague Robert wrote a brief piece about this subject on his blog, he got a message from the lovely people at Parallels telling him about a plugin they were writing for SCCM which manages Macs. Stay tuned for Part Two of this article.

*Update*

Part two of this article is now up. If you want to see how this story ends, please click here

James at the NIA, 22/11/2014

The initial draft of this post merely said “:D”. Suffice to say, I had a good time.

The band James have been touring to promote their new album, La Petite Mort. On Saturday, they played the National Indoor Arena in Birmingham with Starsailor supporting them. My girlfriend and I were in the audience, her birthday gift to me!

I wasn’t familiar with Starsailor’s work before I saw them here, a fact my girlfriend finds incredible, but on the strength of their set I will be seeking out their music. That isn’t something I often say about support acts so make of that what you will!

James then came on, opening with Sound. Tim Booth apologised for missing the high notes in the song but he sounded perfect to me! They played the bulk of the songs from their new album, saying that Walk Like You was their favourite of them, and going by that performance I can see why. He went crowdsurfing during the performance of Frozen Britain, begging the audience to look after him. They played some of the old favourites during the set such as Laid, Out to Get You, Getting Away With It (All Messed Up), Hymn From a Village and Come Home. For the last song of the main set, Tim Booth invited some members of the audience who he had seen dancing up onto the stage and asked them to dance, saying that the definition of good dancing was being able to lose yourself in the music. They then played Gone Baby Gone from the new album and they all danced around like maniacs!

During the encore, they played Born of Frustration, Interrogation from the new album and closed with Sometimes. During Born of Frustration, Tim Booth and Andy Diagram went up to the upper tiers of the hall and walked, played and sang with the people in the seats. He went crowdsurfing again during Sometimes and when the band finished the song, the crowd didn’t and continued singing the chorus. Eventually the band played the song out again and Tim Booth said that couldn’t be topped and the set ended.

I’ll admit that I’m completely biased because James are one of my favourite bands but it was a fantastic gig and I would go to see them again in a heartbeat. The only slight disappointment of the night was that they didn’t play Sit Down but you can’t have everything I suppose!

Cashing In – Queen Forever album review

I was listening to the Chris Evans Breakfast Show on Radio 2 on my way to work a few weeks ago. Brian May and Roger Taylor were guests. They were talking about their upcoming album and the new tracks that were appearing on it. This piqued my interest. He even played a couple of the new tracks which was quite exciting.

I am a huge fan of Queen. I have loved them since I was about eight. I remember quite well the day that Freddie Mercury died and being very upset about it. I was 11 years old. I have all of their studio albums in one form or another. I have all three of the Greatest Hits albums (they were bought for me before I got the studio albums), I have the Rocks compilation album, I have a couple of the live albums, I have the Mr. Bad Guy solo album and I even have (and quite like!) the Barcelona album that Mercury recorded with Montserrat Caballé. So when I heard that they were releasing a new album with new content on it, I did what any fanatic does and went to iTunes to pre-order the Deluxe version of Queen Forever. Fast forward a few weeks and it’s in my iTunes library waiting to be listened to.

What a crushing disappointment it is. Out of the three tracks, only one is actually new (Let Me In Your Heart Again). The other two, Love Kills and There Must Be More to Life Than This, are remixes of two of Freddie Mercury’s solo songs.

The Love Kills mix is quite good; it’s slowed down a little and there is some nice acoustic-sounding guitar playing in there along with Red. It’s a good mix but the reverb effects are a little excessive. There Must Be More to Life Than This is the more interesting of the two. It features vocals from Michael Jackson. Apparently There Must Be More to Life Than This was supposed to be a collaboration between Queen and Michael Jackson from the start. Both Michael Jackson and Freddie Mercury made some recordings but the project eventually got sidelined and the song ended up being developed by Mercury alone and appeared on his solo album.

The remaining members of Queen haven’t made it a secret that Jackson and Mercury didn’t actually sing together on this. They both recorded the song separately and the Michael Jackson recordings were thought to be either lost or buried somewhere. They recently found a high quality copy of the Michael Jackson vocals and put the two together in the mixing studio. The trouble is, I think it shows. I don’t know how to explain it (I’m not a sound engineer) but it sounds to me like there is a level of post-processing on Mercury’s voice that isn’t there on Jackson’s. When the two are put together, there is a really jarring difference between the two recordings. It puts me in mind of the recent mashup of We’ll Meet Again with Katherine Jenkins and Dame Vera Lynn. That had a war-time or slightly post-war recording of Dame Vera singing followed by a modern-sounding recording of Jenkins. The difference in quality between the two, although inevitable, was absurd and distracting. The same, although to a much lesser degree, is happening with the new Queen track.

Of course, it could just be the mix that is on the album. Brian May mentioned during the interview that there was a mix that he did and one that William Orbit did. The William Orbit version is on the album, the Brian May one isn’t. I’d like to hear the Brian May one and hear the differences.

As for the rest of the album, it’s yet another compilation album. There are some more obscure tracks on here, a lot from their earlier albums in the 70s and some of my absolute favourites such as Love of my Life, ’39 and In the Lap of the Gods. All of the tracks on the album have been remastered, which is nice if you can hear that sort of thing (I can’t). The second disk of the Deluxe album has five tracks from Made in Heaven which is, frankly, three too many. I quite like that album but they have far stronger songs which they could have put on there. If you haven’t got these songs and if you’re unfamiliar with Queen’s earlier work, I think this would be a fantastic compilation. The trouble is, I do have these songs. I bought them long ago. Having them on yet another bloody compilation does nothing to enhance them. I have paid £12 for one song that I’ve not heard, remixes of two songs that I have and 33 remastered songs of which I already own copies. I know that it was down to me to make sure I knew what I was buying but it feels like I’ve been conned and I’m not especially happy about it. The whole thing feels like a cynical money grab, especially considering how close it is to Christmas.

So in conclusion, buy the album if you haven’t heard Queen’s earlier material. Just buy the new songs if you have.