One great feature of System Center Configuration Manager 2012 (ConfigMgr) is the new compliance settings and configuration baselines functionality. In ConfigMgr 2007 this was known as Desired Configuration Management.
In ConfigMgr 2012 Microsoft really raised their game and now allow for automated remediation, which I primarily use for registry settings. How annoying is it when you configure an application not to self-update, you then install an update (probably via ConfigMgr with System Center Updates Publisher), and it resets your settings and merrily checks for updates – usually leading to calls to the Service Desk along the lines of “My computer is telling me there is an update to application X but it won’t let me install it”?
This is where the awesome compliance setting remediation comes in – it can detect a change, and if instructed to do so in the compliance setting, change the value to what YOU have told it to be, not what the application developer wants it to be.
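A compliance setting with remediation boils down to a detection check plus a remediation action. Here is a rough sketch of that pattern in Python – the registry is simulated as a dictionary, and the key path and values are invented for the example, not a real ConfigMgr script:

```python
# The registry is simulated with a dict; the key path and desired value
# are hypothetical examples (0 = "do not self-update").
DESIRED = {r"SOFTWARE\ExampleApp\AutoUpdate": 0}

def detect(registry):
    """Return the keys whose values have drifted from the desired state."""
    return [k for k, v in DESIRED.items() if registry.get(k) != v]

def remediate(registry):
    """Force any drifted keys back to the values YOU chose."""
    for key in detect(registry):
        registry[key] = DESIRED[key]

# An application update has merrily re-enabled self-updating...
registry = {r"SOFTWARE\ExampleApp\AutoUpdate": 1}
print(detect(registry))   # the drifted key is reported
remediate(registry)
print(detect(registry))   # [] - back in compliance
```

In real life the detection and remediation run on the client on the baseline’s evaluation schedule, so the setting snaps back without anyone ringing the Service Desk.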
Group Policy Objects
Group Policy Objects (GPOs) give you ultimate control over a domain-joined client (be that server or desktop). If you’ve got the Microsoft Desktop Optimisation Pack (MDOP) then you’ve got access to Microsoft’s Advanced Group Policy Management (AGPM) tools – which are fantastic. MDOP is well worth it and it’s cheap (yes, that is cheap and Microsoft in the same sentence). It allows you to log changes to GPOs, do offline testing and loads more. But what if the left hand doesn’t know what the right hand is doing?
If someone authors a change to a GPO that could potentially change something fundamental, for example changes the Remote Desktop firewall settings, how can you monitor that in ConfigMgr?
Enter Microsoft’s Security Compliance Manager (SCM). You’re probably thinking “What the <insert expletive here>!” Bear with me…
Microsoft’s Security Compliance Manager
SCM is a free Solution Accelerator (of which there are many) from Microsoft that can guide you in deploying GPOs to help secure your Windows servers and desktops, with best practice guidance, documentation galore and, best of all, the ability to export CAB files for use in ConfigMgr.
In SCM you can import your existing GPOs and from there you can compare them to Microsoft’s guidance. In addition you can export them to a CAB file for use in ConfigMgr. Big deal? In my opinion – YES! You don’t have to use the comparison aspect, you can just use it as a conduit for the next stage.
In the ConfigMgr console you can import the CAB file into the compliance settings workspace – this in turn generates an array of compliance settings for you. When you dig a little deeper into these settings you find that they use scripts to check compliance; no auto-remediation is available here, but it does a good job of checking settings.
What about just opening the raw ADMX files to find the registry settings?
Rather you than me!
If your GPOs only contain a few settings you can open the parent ADMX file, find the registry strings and use those for remediation if you want… I don’t know about your environment, but that would be a boat load of work for me!
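For the brave, the registry strings really are sitting in the ADMX XML. A rough sketch of digging them out with Python – the ADMX fragment, policy names and registry paths below are invented for illustration, and real ADMX files are considerably larger and messier:

```python
import xml.etree.ElementTree as ET

# An illustrative ADMX-style fragment - the policy names and registry
# paths are made up for this example.
ADMX = """\
<policyDefinitions xmlns="http://schemas.microsoft.com/GroupPolicy/2006/07/PolicyDefinitions">
  <policies>
    <policy name="DisableAutoUpdate" key="SOFTWARE\\ExampleApp" valueName="AutoUpdate"/>
    <policy name="UpdateUrl" key="SOFTWARE\\ExampleApp\\Updates" valueName="Url"/>
  </policies>
</policyDefinitions>"""

NS = "{http://schemas.microsoft.com/GroupPolicy/2006/07/PolicyDefinitions}"

def registry_strings(admx_text):
    """Map each policy name to the (key, valueName) pair it controls."""
    root = ET.fromstring(admx_text)
    return {p.get("name"): (p.get("key"), p.get("valueName"))
            for p in root.iter(NS + "policy")}

for name, (key, value) in registry_strings(ADMX).items():
    print(name, "->", key + "\\" + value)
```

Multiply that by every ADMX file in your central store and you can see why I’d rather let SCM do the legwork.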
So where’s the benefit?
If you’ve got these settings imported in ConfigMgr you can see when the deployed baselines move away from their GPO settings, which can immediately alert you to one of two things:
- An update, whether that be from Microsoft or another company (remember you can control quite a lot of applications via GPOs, not just Microsoft’s – Google Chrome anyone?), may have changed a value you configured in a GPO
- Or more likely, someone has changed something and not let you know. Now if you’re using AGPM you’ll be able to find the individual and have a little chat…
This is not a catch-all. If someone deploys a new setting via a GPO (one that isn’t covered by a compliance setting imported via SCM) you won’t know about it. Communication is key here – make sure the left hand knows what the right is doing.
I’d advise you to take a look at the free Solution Accelerators from Microsoft, of which the Microsoft Deployment Toolkit (MDT) is one – I’ve used it for years and it is amazing for highly configurable desktop deployments. SCM is a great tool to see what Microsoft recommend you do with your infrastructure; Windows is now quite secure out of the box, but if you want to you can harden it much more. Best of all it tells you what you need to do, where you need to do it and, most importantly, why!
Just remember that most registry changes require a reboot to take effect. Just because you remediate a setting it doesn’t necessarily mean the setting is in effect – look at TechNet and do your research.
System Center 2012 is not a single product – it is Microsoft’s collection of management software that covers a wide range of functions: monitoring your infrastructure (servers, switches, SAN, etc.), backing up your entire Microsoft server estate, deploying operating systems (both client and server), creating private clouds, running an IT service desk, automating tasks. The list goes on. The software is vast and covers so much!
So what does System Center 2012 mean to me?
There are 3 applications within System Center 2012 (SC2012) that I use day in day out (one of which saves users a lot of hassle):
Configuration Manager (ConfigMgr): this is all about users and devices and what should be available to them (applications, updates, operating systems). Since implementing ConfigMgr I’ve done a complete desktop operating system refresh (more about that later) and gone from hoping Windows updates installed to knowing what is, isn’t, and is yet to be installed – and where and when.
Virtual Machine Manager (VMM): I run a 4 node Windows Server 2012 Hyper-V cluster which was (mostly) built without the aid of VMM – it took the better part of 3 days to build over Christmas; I’m sure with VMM it would’ve taken about a day – tops. It helps me to manage my Hyper-V cluster, create and manage my private clouds, and get a high-level view of my SAN.
Data Protection Manager (DPM): this is probably my favourite SC2012 application – odd to say “I love backup” but it is so simple to use and so effective it makes everything so much easier.
I’ve been using the Microsoft Deployment Toolkit (MDT) since 2008 to deploy operating systems and it is a great product – I really can’t believe they give it away free! It sounds crazy, but all MDT is is a bunch of scripts and a very small database. In the past I’ve looked at the lines that said “this feature is only available with Zero Touch Interface (ZTI)” with a longing for ConfigMgr. So the day after I installed ConfigMgr I installed MDT 2012 Update 1 on the server and decided a full desktop operating system (OS) refresh was required – we’d just got our Software Assurance (SA) sorted on our desktop licences – to get everyone on Windows 7 Enterprise.
The process is the same as MDT: build, capture, deploy and relax! It’s not quite that easy, but not too far off! The biggest difference is that you don’t really need to use the MDT console – you can do most of it through the ConfigMgr console. The one thing I think MDT does better than ConfigMgr is drivers.
Drivers in MDT are easy – create a decent folder structure, import the drivers you need for each model and away you go. If you’ve got duplicates MDT is fine with that; ConfigMgr, however, isn’t… By allowing you to duplicate drivers, MDT lets you create silos for each model you have, safe in the knowledge that it’ll work. ConfigMgr just gives you some errors when you try to import duplicate drivers, which then involves trawling log files to find out why. If I’m doing it wrong then please let me know!!!
Anyway, after days of work to get the drivers imported, categorised, tagged and built into driver packages I was ready to try a different model from my test box.
So I added some WMI queries in my task sequence to determine what hardware the operating system (OS) was about to be deployed on so it could determine which driver package to use. This worked perfectly and Windows 7 deployed successfully.
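The logic amounts to a chain of “Apply Driver Package” steps, each gated by a WMI condition on the machine’s model. A Python sketch of the idea – the model names and package IDs here are invented for illustration:

```python
# Each entry mirrors an "Apply Driver Package" step gated by a WMI
# condition of the form:
#   SELECT * FROM Win32_ComputerSystem WHERE Model LIKE "%OptiPlex 9010%"
# The model names and package IDs are made up for this example.
DRIVER_PACKAGES = {
    "OptiPlex 9010": "PRI00010",
    "Latitude E6430": "PRI00011",
    "EliteBook 8470p": "PRI00012",
}

def pick_driver_package(wmi_model):
    """Evaluate the conditions top to bottom; first match wins."""
    for model, package_id in DRIVER_PACKAGES.items():
        if model in wmi_model:
            return package_id
    return None  # nothing matched: fall back to generic drivers

print(pick_driver_package("Dell Inc. OptiPlex 9010"))  # PRI00010
print(pick_driver_package("Some Unknown Box"))         # None
```

The substring-style match is deliberate – WMI’s `Model` property often carries vendor prefixes and suffixes, which is exactly why the task sequence conditions use `LIKE` with wildcards rather than an exact comparison.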
Working for a small organisation, I thought it would be better to get user-specific applications installed at deployment time rather than giving users the “gold” image and then waiting for ConfigMgr to figure out who the user is and what they require – so I went back to MDT to see if I could use the roles feature like I’d done previously.
I created a database, created a role, linked some ConfigMgr applications to the role (they could be MDT applications, but then you could be duplicating applications – enter headache), populated the computer record in MDT, attached the role, redeployed the OS and went for a coffee.
The OS deployed as it should, the additional applications were installed and that was that. Cue multiple roles! The overall refresh went reasonably well, a few small issues but they were with poorly written applications rather than any specific ConfigMgr problems.
When ConfigMgr went wrong…
Picture the scene: it’s just before Service Pack 1 for SC2012 is released and I’m eagerly awaiting its Windows Server 2012 support when Microsoft releases its now infamous Windows Management Framework (WMF) 3.0 update for Windows 7. At this point I was quite excited – PowerShell 3 on Windows 7, more cmdlets than you can dream of and all the fun that PowerShell brings! So after diligently deploying it through ConfigMgr to all the client machines I started to notice that ConfigMgr wasn’t happy…
The clients were going inactive. My immediate thought was that something was wrong with the server, as every client was inactive. After extensive diagnosis I determined the server was fine – time to turn my attention to the clients. So I popped on to a well-known search engine and found other people having the same issues – I thought “phew, not just me then” – but there was no answer. Cue hours of tinkering, uninstalling and reinstalling the client software – same result. After adding a few more worry lines, and looking in the mirror each morning for grey hairs, some news came from Microsoft, and I’m paraphrasing it here: don’t install WMF 3.0 if you’re using ConfigMgr 2012 RTM, it breaks it… Great.
My first thought was not suitable to be written on this blog; safe to say it involved many, many four letter words…
So what to do? My ConfigMgr install was “officially” broken and I hadn’t got a clue what to do. Back to the well-known search engine… Many days later, after piecing bits together, I created a VB script, and several batch files, to sort the problem out – not the official way of resolving the problem, but I wasn’t about to admit defeat… The scripts got executed overnight; I came in the next morning, opened the ConfigMgr console, took a deep breath and clicked on Devices, waited for the console to load (slowly turning blue as it took forever – or so it seemed) and finally breathed – success!
Then SP1 was released… The person at Microsoft who used the wrong digital certificate on the release should be – hang on this heading in to four letter word territory. Moving on…
ConfigMgr has allowed me to ensure that everything is how I want, and need it, to be. Through the use of compliance settings I am able to ensure that Group Policy Object (GPO) settings are applied and maintained (auto-remediation is a big help in this area), application settings are set and maintained (no more “there is an update available”, see annoying Adobe Flash notices to end users) and that if something does move away from what is needed I know about it and can find out why.
Virtual Machine Manager
VMM for me isn’t directly used to create amazing private clouds that users can log into, create VMs based on templates etc. My organisation isn’t big enough for those features – it’s my sanity checker.
When adding another node to my Hyper-V cluster I used the failover cluster manager to validate the cluster when adding the extra node and it all came back green as expected. So instead of finishing the process in there I went to VMM and ran the validation from there.
VMM seems to be much more aware of Windows Server 2012 networks than Windows Server 2012 itself! Whilst I had all the correct network adapters configured with the right subnets, VLANs, etc., there was one that had a minute difference from the other nodes. VMM picked this up and said “NICs don’t match”. Some head scratching later I realised what I’d done, changed the properties of one of the Hyper-V virtual switches, reran the VMM validation and all was good.
Whilst I’m certain that the cluster would’ve been fine without this change it is good to know that the product designed for this purpose really knows its stuff!
VMM allows me to view my SAN storage quickly and to see where I’ve got spare capacity and what may need adjustment. Combine this with VMM’s knowledge of my VMs and any thin provisioning I’ve done and I can see how much storage I have and what would happen if all my thinly provisioned storage suddenly filled up!
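That thin-provisioning worry is simple arithmetic, and it’s the sum VMM is effectively doing for you. A back-of-the-envelope sketch in Python – the capacities and LUN sizes are invented for illustration:

```python
# The array capacity and thin LUN sizes here are made up for the example.
physical_capacity_gb = 2000
thin_luns_gb = [500, 750, 400, 600]   # provisioned (maximum) LUN sizes

provisioned_gb = sum(thin_luns_gb)
shortfall_gb = provisioned_gb - physical_capacity_gb

print(provisioned_gb)  # 2250 GB promised to hosts and VMs
print(shortfall_gb)    # 250 GB short if every thin LUN fills up
```

A positive shortfall is the number that should make you nervous – it’s the gap between what you’ve promised and what the SAN can actually deliver.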
Data Protection Manager
I’ve been using DPM since 2011 when our business continuity provider told me about the product. Before then I’d never heard of it and I don’t know why not! DPM is Microsoft’s answer to Backup Exec, NetBackup, etc. but with one big difference – it’s a Microsoft product and as such DPM doesn’t play with non-Microsoft products. Currently there are no agents for operating systems created by other vendors (maybe SP2, or whatever they’re rebranding service packs to, will bring agents for other OSs).
DPM leverages the Volume Shadow Copy Service (VSS) to perform its backups. Back in the old days, every time you made a change to a file the archive attribute was set on it, so the next differential/incremental backup would find the files with the archive attribute set, back them up and reset the attribute. This meant the entire file was backed up. What if the file you changed was 1GB? What if you only changed a lower case “a” to an upper case “A”? The whole file was backed up. What a waste. What if there was a way of backing up only the part of the file that changed? Cue VSS.
By leveraging VSS, DPM makes one complete backup of the file and then tracks changes to it at block level: it knows what changed and where in the file, and backs up just that. Below is a rough example of what happens.
You create a protection group that has a retention policy of 3 days on disk. You have a 1GB file that you make changes to every day that is stored on an operating system protected by DPM. When you first protect the OS DPM creates an entire copy of the file. Each green block below represents a block within the file (not entirely accurate but enough to show how it works):
So after DPM has done its initial copy of the file and you make a slight change in the file the VSS service tracks this change:
On the next DPM synchronisation it copies only the changed block (no more copying a 1GB file each time you backup). Bearing in mind that DPM can synchronise protection groups as frequently as 15 minutes (great for file shares when users are making lots of changes) it can keep the network traffic from peaking and troughing.
So at the next synchronisation DPM knows it already has all the green blocks and the red block. If you then make further changes to the same file VSS tracks this; in the example below DPM would only backup the blue blocks.
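The block-tracking behaviour described above can be modelled in a few lines of Python – a toy simulation of the idea, not how DPM is actually implemented:

```python
# A toy model: the first backup copies every block, later
# synchronisations copy only the blocks reported as changed.
def initial_replica(blocks):
    return list(blocks)          # full copy - all the "green" blocks

def synchronise(replica, blocks, changed):
    """Copy only the changed block indices; return blocks transferred."""
    for i in changed:
        replica[i] = blocks[i]
    return len(changed)

source = ["green"] * 8
replica = initial_replica(source)

source[2] = "red"                # one small edit -> one changed block
sent = synchronise(replica, source, changed={2})
print(sent)                      # 1 - one block sent, not the whole file
print(replica == source)         # True - the replica is up to date
```

Scale the blocks up to a 1GB file and the saving is obvious: one changed block crosses the network instead of the whole gigabyte.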
After 3 days (as the protection group dictates) the first recovery point is flushed into the original copy of the file; this means that if you restore from the oldest recovery point you have at this time, the copy of the file you get back will contain all the green blocks and the red block. As time progresses each recovery point is flushed into the “original copy”. Combining this with long term backup (standalone tape/tape libraries/virtual tape libraries) you can enable rapid restoration of recent backups with the safety of long term retention if needed.
Just a quick note on synchronisations and recovery points – you can specify a sync schedule of 15 minutes but you can only have 64 recovery points (this is hard coded in VSS). So if you need 10 recovery points created per day, you can only have a retention period of 6 days (no part days allowed). If you specify a sync schedule of 15 minutes, DPM will copy the changed blocks every 15 minutes but will not create a recovery point until it is scheduled to, at which point it takes all the synchronisations between the last recovery point and the current time, mashes them together and creates a recovery point.
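The retention maths above is just integer division against the 64 recovery point limit; a quick Python sketch:

```python
MAX_RECOVERY_POINTS = 64  # the hard limit mentioned above

def max_retention_days(recovery_points_per_day):
    """Whole days of retention available (no part days allowed)."""
    return MAX_RECOVERY_POINTS // recovery_points_per_day

print(max_retention_days(10))  # 6 - the example above
print(max_retention_days(4))   # 16
```

Worth running before you promise a retention period to the business – the limit bites sooner than you’d expect once you start creating several recovery points a day.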
As DPM leverages VSS, the above functionality is available for any VSS-enabled application, for example SQL Server, Exchange Server and SharePoint (DPM is amazing with SharePoint). You can even use this with Hyper-V VHDs! If you’re using a storage appliance that is VSS aware you can leverage this at the hardware level (much faster than Hyper-V VSS).
You can integrate DPM with Active Directory so it can expose itself to end users. For example, you can combine it with file shares for end user restores using the “Restore Previous Versions” feature of Windows. This can equate to fewer service desk calls (if the users are trained properly).
Other System Center Applications
There are several other SC2012 applications:
- System Center Operations Manager: monitors everything in your IT estate, you can even get it to look inside .NET applications to see what is happening in the code and see what is slowing the whole thing down or causing it to crash!
- System Center Orchestrator: takes your repetitive tasks and automates them. Imagine you have a script that you run to fix a problem, but the problem happens at random times – how can that be automated? If you can capture the problem in Operations Manager you can get it to execute a runbook in Orchestrator to fix it for you and then tell you what it has done!
- System Center Service Manager: this is basically a service desk solution that hooks into all the other SC2012 products. It uses the best practices found in the Microsoft Operations Framework and the Information Technology Infrastructure Library (ITIL)
- System Center App Controller: gives you a portal from which you can manage applications on on-premises clouds and the Windows Azure platform
- System Center Endpoint Protection: Microsoft’s enterprise anti-malware product (here malware includes viruses) that uses ConfigMgr. It is lightweight and uses all the existing software update and reporting infrastructure you’ve built with ConfigMgr
- System Center Advisor: not strictly part of SC2012, but available to you if you have the necessary licencing; it allows you to monitor cloud-based applications
In SC2012 Microsoft have unified all their management products that were previously available separately. The changes in SC2012 reflect the overall changes in the way IT is used by businesses and end users. SP1 introduced further changes to the products to include the latest developments from Microsoft, including incorporating Intune, Azure, Windows 8 (including the Windows Store), Windows Server 2012, Microsoft Desktop Optimisation Pack 2012 (MED-V 2, App-V 5, DaRT), Apple, Linux, etc. (far too many things to list).
Some of the SC2012 suite is evolutionary some of it is revolutionary!