A question that is bouncing around now is: Should I plan on using RemoteFX for everyone?
Short answer: probably not.
Long answer: …
Let’s step back. In 1996 you (or your colleagues/predecessors) probably gave everyone a laptop or PC. Then we all started hearing about this thing called server based computing. The big player was Citrix with something called WinFrame, based on Windows NT Server 3.51 (back when it required real IT pros to network a PC or server). Not long after that the Citrix/Microsoft relationship changed and we got Terminal Services, which could be extended by the Citrix solution.
That’s about when some sales & marketing people (we didn’t have bloggers back then, did we?), and yes, a few of us consultants, started shouting that the PC was dead … long live the server!
Things didn’t quite work out like that. Instead, Terminal Services (and the rest) usually became a niche solution. It was great for delivering awkward applications to end users, especially those who were remote from the server, working from home or in a branch office. But the PC still dominated end user computing.
Not long after, I remember a rather large consultant colleague from Berkeley who derided me for learning more about Windows. Didn’t I know that the Penguin would rule the world?!?!?! Hmm … anyway …
Then a few years ago we saw server virtualisation being adapted (with the help of a broker) to take the remote client of server based computing and connect it to centrally located VMs running desktop operating systems. VDI hit the headlines. I’ve swung back and forth on this one so many times that I feel like a politician in election season. At first I loved the idea of VDI: it gave us the benefits of Terminal Services without the complexity of application compatibility (application silos), while retaining individual user environments. Then I hated it: the costs are so high compared to PC computing, and you actually need more management systems instead of fewer. And now I’m kinda swinging back to liking it again.
This is because I think it fits nicely as part of an overall strategy. I can see most people needing PCs. But sometimes VDI is the right solution, when people need an individual working environment that won’t be interfered with and that has to be centralised. And sometimes Remote Desktop (Terminal) Services (RDS) is the right solution, because it gives that centralised environment at user-per-server densities that VDI just cannot match. And guess what: sometimes you need PCs, VDI and RDS all in the same infrastructure, just for different users.
But let’s get back on track. What about RemoteFX? Would every user not want it? And what the hell is RemoteFX?
RemoteFX is a feature of Windows Server 2008 R2 with Service Pack 1. In other words, it’s a few weeks old (after heavy public beta/RC testing). It allows Hyper-V VDI hosts or RDS session hosts to take advantage of one or more (identical) graphics cards in the physical server to provide high definition graphics to remote desktop clients. That solves a problem for users who want to run graphics-intensive applications. Without RemoteFX, the graphics suck as bitmaps are drawn on screen. RemoteFX adds the ability to leverage the host’s GPUs, combined with a new channel, to smoothly stream the animation over the wire. It also allows client-attached USB devices to be redirected to the user session without the need for drivers on the client. Sounds great, eh? Everyone on VDI or RDS should have it! Or should they?
You can find the hardware requirements for RemoteFX on TechNet. And this is where things start to get sticky. A user with a single normal monitor will require up to 184 MB of video card RAM. That doesn’t sound like much until you start to think about it. I’ve done a little searching on the HP side. The largest card that they support is an NVIDIA Quadro FX 5800, which has 4 GB of RAM. That means one of these GPUs can handle 22 users! You can team the cards, but you can only fit so many into a host. For example, one of the 3 or 4 servers that HP supports for RemoteFX is the 5U-tall ML370 G6 (not your typical virtualisation or session host spec) and it only takes 2 cards. That’s 44 users, which is not all that much, especially when we consider large multi-core CPUs, huge RAM capacities, SAN storage, and Dynamic Memory. I don’t think this is a failing of RemoteFX; I think this is just a case of applications needing video RAM. This type of technology is still in its very early days, and video card manufacturers are watching and waiting.
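The back-of-the-envelope maths above can be sketched out like this. It’s purely illustrative, using the figures quoted in this post (184 MB of video RAM per single-monitor user, a 4 GB card, and a host that takes 2 cards); actual VRAM consumption varies with monitor count and resolution, so check the TechNet tables for your own sizing.

```python
# Rough RemoteFX capacity estimate, using the figures from this post.
# These numbers are illustrative, not a sizing guarantee.
VRAM_PER_USER_MB = 184        # up to 184 MB per single-monitor user (TechNet figure)
CARD_VRAM_MB = 4 * 1024       # NVIDIA Quadro FX 5800: 4 GB of video RAM
CARDS_PER_HOST = 2            # e.g. the HP ML370 G6 takes 2 supported cards

# Integer division: a user either fits in the card's VRAM or doesn't.
users_per_card = CARD_VRAM_MB // VRAM_PER_USER_MB
users_per_host = users_per_card * CARDS_PER_HOST

print(users_per_card)   # 22
print(users_per_host)   # 44
```

Compare that 44-user ceiling with what the same host could handle on CPU and RAM alone, and the density problem becomes obvious: the GPUs run out long before the rest of the server does.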
There are special rack kits that contain lots of extra video processor/RAM capacity and can be hooked onto servers. One of my fellow MVPs is using one of these. They work, but they are expensive.
And then there’s the other requirement: network. This stuff is designed to work with 1 Gbps LANs, not for WANs.
So back to where we started: Should I plan on using RemoteFX for everyone? For most people the answer will be no. There will be a very small number who will answer yes. Think about it. How many end users really do need the features of RemoteFX? Not all that many. Implement it for everyone and you’ll have more, bigger hosts, hosting fewer users. You’ll also be limited to using it in the LAN.
I think we’re back to the horses-for-courses argument. Maybe you’ll have something like the following, or a variation of it (because there are lots of variations on this):
- A lot of PCs/laptops on the main network
- Some people using VDI with whatever broker suits them
- Some GPU intensive applications being published to PCs/laptops/VDI via RDS session hosts
- And a measure of App-V for RDS and ConfigMgr to take care of it all!
The PC is dead! Long live the PC! But what about the iPad? *running while I still can*