Desktop virtualization is becoming more prevalent. The technology for delivering it is maturing, and adoption is increasing. But desktop virtualization (or VDI, to use the common acronym) is often deployed with the wrong approach or goals in mind. It has been a buzzword for so long that, like “cloud,” it is often pushed by management without a true understanding of what it can deliver and what benefits it brings.
I’m standing on the cusp of a new desktop virtualization project myself, but for the first time, I’m looking at a blank slate. Too often I’ve joined projects when they have hit snags or are ready to move into the implementation phase. In such instances, I often find myself asking “why?” when looking at technological choices or architectural decisions. More often than not, there is no possibility—or simply not enough courage—to make a change.
These problems stem primarily from having no clearly defined goal for the desktop virtualization project. The project becomes something that was implemented “because everyone is doing VDI,” or the company has the notion that one of the VDI “features,” such as the ability to provide a desktop at home identical to the one the user gets at work, will lead to immense user satisfaction. But it is important to be much more specific in providing a tangible benefit that users will see as a boon. For this current project, I am setting one definite goal, which is, simply:
The virtual desktops should have noticeably better performance than the existing physical desktops.
I know that “noticeably” is a completely subjective measure, so naturally I will set some more tangible baselines (such as x% quicker logon time, x% quicker application launch, x% more IOPS, and so on). I also know that speed isn’t everything; reliability, familiarity, and ease of use will all be factors. But the major goal I want to achieve is entirely performance based.
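Baselines like these only mean something if they are captured before and after, then checked against the targets. A minimal sketch of that check (every number and metric name here is an illustrative placeholder, not a figure from any real deployment):

```python
# Compare measured virtual-desktop metrics against percentage-improvement
# targets relative to the existing physical-desktop baseline.
# All figures are illustrative placeholders.

physical_baseline = {"logon_seconds": 45.0, "app_launch_seconds": 6.0}
virtual_measured = {"logon_seconds": 30.0, "app_launch_seconds": 4.5}
targets_pct = {"logon_seconds": 25.0, "app_launch_seconds": 20.0}  # required % improvement

def improvement_pct(baseline: float, measured: float) -> float:
    """Percentage improvement of measured over baseline (lower is better)."""
    return (baseline - measured) / baseline * 100.0

for metric, required in targets_pct.items():
    gain = improvement_pct(physical_baseline[metric], virtual_measured[metric])
    status = "PASS" if gain >= required else "FAIL"
    print(f"{metric}: {gain:.1f}% improvement (target {required}%) -> {status}")
```

The point is less the arithmetic than the discipline: agree the metrics and targets up front, so "noticeably better" becomes something you can prove.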
What we have to remember is that for the users, it is all about the applications. Users now have tablets and laptops at home that are quick and responsive, often with the marked ease of use and ease of deployment that go hand in hand with app stores and the like. If they’re in the habit of comparing their home devices with those provided at work, you’ve got a fight on your hands to win their hearts and minds. But that is what you should be aiming for, because, as I covered in a previous post, allowing users to develop a negative image of a desktop virtualization project is an easy way to make the project a failure before it even has a chance to succeed. The VDI project will be something new, and it is perfectly acceptable for users to expect it to be better. If you received a new mobile phone and it wasn’t as good as the one you’d just gotten rid of, you’d be a bit annoyed, right?
How, then, does one set about making sure that users can actually get better performance from a desktop virtualization project? Don’t forget, in many projects—such as this one—you may be right up against it from the start because of the previous infrastructure. You may be moving from a purely physical environment to a purely virtual one—making the infrastructure highly dependent on back-end systems and network technologies.
Without wanting to discuss particularly the dependence of desktop virtualization on web services, databases, network connectivity, and other factors that all have some bearing, the main thrust of performance efforts should be concentrated on storage. There’s no getting away from it: the performance of your storage will have a direct bearing on how good your virtual desktops appear to users.
This is another area where desktop virtualization projects can run into trouble. Too often, storage technology has been implemented as part of an earlier infrastructure refresh, where the drivers were not the same. When you look at deploying virtual desktops, the storage side of the equation should be reassessed and improvements factored in. It may sound crazy, but desktop virtualization projects are often embarked upon without any attention to the performance of existing storage technology. Desktop storage requirements differ significantly from server storage requirements: desktop workloads tend to be write-heavy and bursty (think boot and logon storms), while server workloads are generally steadier.
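To see why desktop storage deserves its own assessment, a rough sizing sketch helps: steady-state IOPS per desktop, a storm multiplier for boot and logon peaks, and a read/write split. The per-desktop figures below are commonly quoted rules of thumb, not measurements; assess your own environment before committing to hardware.

```python
# Back-of-the-envelope IOPS sizing for a VDI deployment.
# The per-desktop figures are illustrative rules of thumb only;
# measure real workloads before buying storage.

desktops = 500
steady_iops_per_desktop = 15       # assumed steady-state figure
storm_multiplier = 3               # boot/logon storms can multiply demand
write_fraction = 0.7               # VDI steady state tends to be write-heavy

steady_total = desktops * steady_iops_per_desktop
peak_total = steady_total * storm_multiplier
steady_writes = steady_total * write_fraction

print(f"Steady-state IOPS: {steady_total}")
print(f"Peak (storm) IOPS: {peak_total}")
print(f"Steady-state write IOPS: {steady_writes:.0f}")
```

Even with conservative assumptions, the peaks land an order of magnitude away from what a server-oriented array refresh may have been sized for, which is exactly why the earlier storage decisions need revisiting.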
Solutions exist that allow you to accelerate storage using RAM and Flash (Atlantis and Infinio provide such solutions, to name just two). Coupled with other benefits, such as in-line deduplication and compression, and ease of scale and repair, these technologies can make a real difference in how your virtual desktops perform. If your budget allows, you should always consider these “non-traditional” storage methods, which have become something of a disruptive influence. The performance you can get at a given price point has improved markedly over the last couple of years. If you’re serious about delivering a high-quality performance level from virtual desktops, then you really shouldn’t be ignoring the new guys on the storage block.
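The deduplication and compression benefit is easy to quantify in rough terms: non-persistent desktops cloned from a common image share most of their blocks, so reduction ratios can be high. The ratios below are purely illustrative assumptions; real results depend heavily on image design and workload.

```python
# How in-line deduplication and compression stretch usable capacity.
# The ratios are illustrative assumptions, not vendor claims.

raw_capacity_gb = 2_000
dedup_ratio = 5.0        # assumed: cloned desktops share most blocks
compression_ratio = 1.5  # assumed

effective_gb = raw_capacity_gb * dedup_ratio * compression_ratio
print(f"Effective capacity: {effective_gb:.0f} GB from {raw_capacity_gb} GB raw")
```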
There’s another possibility that many neglect to consider: forgoing the hypervisor and investing in dedicated resources for users through a hosted blade workstation system like HP’s Moonshot. Again, budget will be the pressing issue here, but if you want to guarantee users a particular level of service without the capacity planning and contention testing that shared infrastructure demands, then the blade workstation route may be one you need to consider.
But that’s not to say that storage is the only consideration (or investment, if you prefer) you need to take into account in order to deliver performance from virtual desktops that can outstrip that of physical ones. A whole host of factors are in play that you need to be aware of: storage is just the most pertinent one. Don’t neglect to pay attention to the others!
That said, setting goals such as the one I’ve discussed here is absolutely key to the success of any desktop virtualization project. The most overlooked piece of the VDI jigsaw puzzle, from what I’ve seen, is the user experience. No matter how much cost reduction or management simplification you can claim, if the users have a bad time, then the project will be deemed a failure. Set yourself goals for desktop virtualization that put the user at the heart of everything, and implement those goals correctly, and you may just find that you have delivered a resounding success.