Windows Server 2012 RTM in Production as Backup Media Server

As Brad Anderson has suggested a few times this year, we are leading our businesses to a better IT future. Here's a quick screenshot of one of our backup media agent servers hooked up to a bunch of NL-SAS disks (150TB). One host (an older model, no 10Gbps yet, running Windows Server 2008 R2) is sending backups over its dedicated 1Gbps interface to this media agent server, which has a 10Gbps interface. Performance and traffic look great Smile. I love it when a plan comes together!

[screenshot: 10Gbps interface traffic on the media agent server]

 

This is what it looks like at the host being backed up:

[screenshot: 1Gbps interface traffic on the host being backed up]
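As a quick sanity check on those graphs: a dedicated 1Gbps interface tops out at 125MB/s in theory, so sustained backup traffic in the 110-115MB/s range is about as good as it gets from that host, while the 10Gbps interface on the media agent server leaves plenty of headroom to take in several such streams at once.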

Now we’ll see what magic Data Deduplication can do for us with the RTM bits!
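For those who want to follow along, enabling it on the RTM bits is only a handful of PowerShell lines. This is a minimal sketch, run from an elevated prompt and assuming the backup target volume is E: (that drive letter is just an example, adjust to your own layout):

# Add the Data Deduplication role service
Install-WindowsFeature -Name FS-Data-Deduplication
# Enable dedup on the backup volume and kick off an optimization job right away
Enable-DedupVolume -Volume E:
Start-DedupJob -Volume E: -Type Optimization
# Check the savings once the job has run
Get-DedupStatus -Volume E:

Keep in mind that by default deduplication only processes files that are a few days old, so freshly written backup files won't shrink immediately.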

Windows Server 2012 KMS Service Activation

Now that we have the Windows Server 2012 and Windows 8 RTM bits from our Volume Licensing, we need to get some housekeeping done. The first thing we do is set up or update our KMS service.

In our case it is running on Windows Server 2008 R2, so we need to do a couple of things.

Install the following update: "An update is available for Windows 7 and Windows Server 2008 R2 KMS hosts to support Windows 8 and Windows Server 2012", as described in KB2691586. That article is also where you can request the hotfix. If you don't install it, registering a Windows Server 2012 KMS key will throw error 0xC004F050: "The Software Licensing Service reported that the product key is invalid".

So request the hotfix and install it from an elevated command prompt. Just follow the instructions and you'll be fine Smile.
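If you'd rather script it, the hotfix comes down as a .msu package that you can install with wusa.exe from that same elevated prompt. The file name below is just an example of what the x64 package typically looks like; use whatever name the hotfix download actually gives you:

wusa.exe Windows6.1-KB2691586-x64.msu

Add /quiet /norestart if you're rolling this out via a script and want to control the reboot yourself.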

[screenshot: KB2691586 hotfix installation prompt]

Once you've clicked OK, the installation will start.

[screenshot: hotfix installation in progress]

After that's finished you will be asked to restart the server. Do so; just restarting the KMS service ("net stop sppsvc" and "net start sppsvc") doesn't suffice.

[screenshot: restart prompt after the hotfix installation]

Now that we have that out of the way, we can start putting our brand new KMS key into action.

Let’s take a look at what is already running:

slmgr.vbs /dlv => clearly still the Windows Server 2008 R2 KMS key
[screenshot: slmgr.vbs /dlv output showing the Windows Server 2008 R2 KMS key]

Uninstall the current KMS key using slmgr.vbs /upk; again, please use an elevated command prompt Winking smile

[screenshot: slmgr.vbs /upk uninstalling the current KMS key]

 

Now you can install the new KMS key. The key listed here is obviously a demo one Winking smile. If you run into any issues here, restarting the KMS service can help. Try that first.

slmgr.vbs /ipk NOPEI-AMNOT-GIVIN-GITTO-YOU!!

[screenshot: slmgr.vbs /ipk installing the new KMS key]

 

Now activate your brand new KMS key by running slmgr.vbs /ato

[screenshot: slmgr.vbs /ato activating the new KMS key]

 

Show what's up and running now by running slmgr.vbs /dlv again; as you can see, we're in business to activate all our Windows Server 2012 and Windows 8 hosts. Life is good Smile

[screenshot: slmgr.vbs /dlv output showing the Windows Server 2012 KMS key]
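For the Windows Server 2012 and Windows 8 machines themselves there's normally nothing to do: KMS clients installed from the volume license media find the KMS host via the _vlmcs._tcp SRV record in DNS and activate automatically. If you ever need to point a client at the KMS host by hand, something along these lines will do (kms01.mydomain.local is just a placeholder for your own KMS host):

slmgr.vbs /skms kms01.mydomain.local:1688
slmgr.vbs /ato

Also remember the activation thresholds: the KMS host only starts handing out activations after it has seen requests from at least 5 servers or 25 clients.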

Windows Server 2012 Bits Available for Download for Volume Licensing Customers

A long-awaited day has arrived. The bits of Windows Server 2012 RTM are available to us. Ever since the BUILD conference in September 2011, a lot of us have been diving into this version with enthusiasm and amazement at what's in the product. As a matter of fact, I've "sold" projects based on Windows Server 2012 internally since October 2011 because we were that impressed with what we saw.

  • Grab the bits on the Microsoft Volume Licensing site (from August 16th onwards). I wish I could tell you they're also on TechNet or MSDN, but no joy there so far.

So we've been poring over the product and the information available, gradually gaining a better understanding of what it can do for us and our businesses. That meant building labs, testing scenarios and presenting on the subject on various occasions. There was also a lot of thinking, dreaming and discussing of ideas and options about what we can do with this version of Windows. It has been a very busy 11 months, but I'm also very happy to have had the opportunity to attend several summits and conferences where I met up with colleagues, fellow MVPs and MSFT employees who all shared the enthusiasm for this release and what it means for Hyper-V and the private/hybrid cloud.

 

So to all of them, ladies & gentlemen, my online community buddies from all over this planet: it's been a blast Smile. They have been very helpful in all this, as have all the Microsoft employees who've answered and discussed all the questions and ideas we threw at them. I would like to thank all of them for their time, their patience and the opportunities given to us. I can offer those guys & gals just one reward: the fact that from day one we are taking this into production and gradually will do so for all our infrastructure systems and so on. It's a no-brainer when you've worked with the RC and seen what Windows Server 2012 can do. And no, I'm not forgetting Windows 8. SMB 3.0, DirectAccess and Windows To Go alone make that a sweet proposition, but I got those bits already.

[screenshot]

Well, the downloads are running and the installation of our first production Hyper-V cluster and infrastructure servers can start as soon as that's finished (we'll lead Brad, we'll lead Winking smile). After some initial tests these will be taken into service, and that last round of feedback will provide us with the go or no-go for the rest of our infrastructure. The speed & completeness of our move depends partially on how fast System Center 2012 SP1 brings support for Windows Server 2012.

So future blog posts and my next presentations will be spiced with some real-life production experience with the RTM bits. May all your rollouts be smooth ones!

Disk to Disk Backup Solution with Windows Server 2012 – Part I

Backing Up 100-Plus Terabytes of Data Cheaply

When dealing with large amounts of data to back up, you're going to start bleeding money. Sure, people will try to sell you great solutions with deduplication, but in a lot of scenarios this is not very cost effective. Dedupe in either backup hardware or software is very expensive, and in some scenarios the cost cannot be justified. It's also not very portable, by the way, unless in certain scenarios where you stick with certain vendors. Once you get into backing up > 100TB you need to forget about overly expensive hardware & software. Just build your own solutions. Now, depending on your needs you might want to buy backup software anyway, but forget about dedupe licenses. Some of the more profitable hosting companies & cloud providers are not buying appliances or dedupe software either. They make real good money, but they'd rather spend it on SUVs and swimming pools.

What Can You Do?

You can build your own solution. Really. You can put together some building blocks that scale up and out. You'll need a dual-socket server with two 8-core CPUs and 24GB of RAM, perhaps 32GB. Plug in some 6Gbps SAS controllers, hook those up to a bunch of 3.5” disk bays with 12 x 2TB or 3TB disks each and you're good to go. You can scale out to about 8 disk bays if you don't cluster. Plug in a dual-port 10Gbps card. You'll need that, as you'll be hammering that server. If you need more than this system, then scale out: put in a second, a third, etc. 3.5TB-4TB of backup capacity per hour in total should be achievable.

[image]
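A quick back-of-the-envelope on those numbers: 8 bays of 12 x 3TB disks comes to 288TB raw per media server (less after RAID or parity, of course), and 3.5TB-4TB per hour works out to roughly 1GB/s sustained. That's exactly why the dual-port 10Gbps card isn't optional: a single 10Gbps port tops out at around 1.2GB/s.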

When you buy the components from Super Micro and some online retailers you can do this pretty cheaply. Spare parts, you say? Buy some cold spares. You can have a dozen disks on the shelf, a SAS controller and even a whole disk bay if you want. You could use hardware redundancy (RAID, hot spares) or use storage pools & spaces if you're going the Windows Server 2012 route and save some extra money (see the sketch below). Disk bay failure? Scale out so that even when you lose a node you still have three others up and running. Spread backups around. Don't back up the same data only to the same node. I know it's not perfect for deduplication with Windows Server 2012 that way but hey, you win some, you lose some. Checks & balances, right? If you need a bit more support, get some DELL PowerVaults or the like. It depends on what you're comfortable with and how deep your pockets are.
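To give you an idea of what the storage pools & spaces route looks like, here's a minimal PowerShell sketch on Windows Server 2012. The pool, space and volume names are made up, and a parity space is just one of the options; pick the resiliency setting (or plain RAID on the controller) you're comfortable with:

# Pool every disk that is eligible for pooling
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "BackupPool" -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks
# Carve a parity space out of the pool and turn it into an NTFS volume
New-VirtualDisk -StoragePoolFriendlyName "BackupPool" -FriendlyName "BackupSpace" -ResiliencySettingName Parity -UseMaximumSize
Get-VirtualDisk -FriendlyName "BackupSpace" | Get-Disk |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "Backups"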

You can buy more storage than dedupe will ever save you & still come out with money to pay for the electricity. Okay, it's less good for the penguins, but trust me, those companies selling those solutions would fry a penguin for breakfast every day if it would make them money. Now, talking about those penguins, the Windows Server 2012 deduplication feature could be providing me with the tools to save them Smile, but that's for another post. I hope this works. I'd love to see it work. I bet some would hate to see it work. So much, perhaps, that they might even consider making their backup format non-dedupable Devil?
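If you want a feel for what deduplication would win you on an existing backup store before committing to it, the evaluation tool that ships with the feature (DDPEval.exe, which lands in \Windows\System32 once the deduplication role service is installed) can be pointed at any folder or volume. The path below is just a placeholder:

ddpeval.exe E:\Backups

It churns through the files and reports the projected space savings without changing anything.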

Tip for users: don't use really cheap green SATA disks. They're pretty environmentally friendly, but the performance sucks. My view on "Green IT" is to right-size everything, never oversubscribe and let that infrastructure work hard for you. This will minimize the hardware needed, and the performance is way better than with all the power-saving settings and green hardware. Those will ruin the environment anyway, as you'll end up buying more gear to compensate for the lack of performance, unless you just suffer the bad performance. Keep the green disks for the home user's picture, movie & music collection and use 2TB/3TB SAS/NL-SAS. Remember that when you don't cluster (shared storage) you can make do nicely without the enterprise NL-SAS disks.

Now, I'm not saying you should do what I suggest here, but you might find it useful to test this at your own scale for your own purposes. I did it for the money. For the money? Yup, for the money. No, not for me personally; I don't have a swimming pool and I don't even own a car, let alone an SUV. But saving your company 100,000 or more in cash isn't going to get you into trouble now, is it? Or perhaps this is the only way you're going to afford to back up that volume of data. People don't throw away data and they don't care about budgets, but you'd better be able to restore their data. Which reminds me, you will also need a backup software solution that doesn't cost an arm and a leg. That's also a challenge, as you need one that can handle large amounts of data and has some intelligence when it comes to virtualization, snapshots etc. It also has to be easy to use, as simple as possible, as this helps ensure backups are made and are valid.

Are we trying to replace appliances or other solutions? No, we're trying to provide lots of cheap and "fast enough" storage. Reading the data & delivering it to the backup device can be an issue as well. Why fast enough? Pure speed on the target side is not useful if the sources can't deliver. We need this backup space for when the shit really hits the fan and all else has failed. That doesn't have to be a SAN crash or a SAN firmware issue ruining all your nice snapshots. It can also be the business detecting a mistake in a large data set a mere 14 months after the fact, when all replicas, snapshots etc. have already expired. I'm sure you've got quality assurance that is so rock solid that this would never happen to you, but hey, welcome to my world Sarcastic smile.