At first glance, desktop virtualization seems fairly simple. Just deliver an operating system image, tune it, set the policies required, persist the user state, and you’re good to go. Right?
Not quite. Desktop virtualization solutions have so many moving parts that they’re never going to be that straightforward. It’s important to set concrete goals to make sure you are working within your operational parameters. I touched on this subject in a previous article: Setting Appropriate Desktop Virtualization Goals. Once you’ve got this part down, you need to move on to the crucial application analysis stage. As I like to repeat whenever possible, “it is all about the applications.”
A Software Estate Overview
In an ideal world, every enterprise would know exactly what applications they have in use, how often they are used, and by whom. Unfortunately, software inventory is rarely, if ever, taken as seriously as hardware inventory—in many cases, it is not done at all. I have yet to work on a single project where the business had a complete overview of its software estate; many simply shrug their shoulders when confronted with questions around it. The problem is further exacerbated when companies are composed of disparate business units, or have grown by acquisition. It gets even worse when you factor in “seasonal” applications that may only be used at specific times of year or in particular circumstances. In worst-case scenarios, you may even be dealing with applications that the IT department isn’t aware of—certain business areas may have subscribed to SaaS applications they use online, or they may have developed things themselves that have become critical to their own particular area, such as Access databases and the like.
So, when you embark upon the application analysis phase, the knowledge in the IT department may be incomplete or simply unfit for purpose. Surveying users and manually gathering usage metrics can’t be relied on to provide the information required: users often forget some of the tools they use. In this phase, an automated tool is probably the best bet for information gathering, supplemented with knowledge from users and IT administrators.
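To see why supplementing the automated tool with human knowledge matters, consider cross-checking the tool’s output against what IT believes is in use. The sketch below is purely illustrative: the file contents and application names are invented stand-ins, not the export format of any real assessment product.

```python
import csv
import io

def load_apps(f):
    """Read application names (first CSV column) into a normalized set."""
    return {row[0].strip().lower() for row in csv.reader(f) if row}

# Sample data standing in for real exports (assumed for illustration):
scan_export = io.StringIO("Microsoft Office\nSAP GUI\nLegacyAccessDB\n")
it_list = io.StringIO("Microsoft Office\nSAP GUI\nRetiredReportingTool\n")

tool_apps = load_apps(scan_export)   # from the automated analysis tool
it_apps = load_apps(it_list)         # from IT department knowledge

# Scanned apps IT had no record of (shadow IT candidates), and apps IT
# lists that the scan never saw (possibly retired or seasonal software).
unknown_to_it = tool_apps - it_apps
missing_from_scan = it_apps - tool_apps
```

The two difference sets are exactly where the follow-up conversations with users and IT need to happen: one side surfaces shadow IT, the other surfaces seasonal or retired applications an automated scan can easily miss.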
Selecting an Analysis Tool
Think about your application and architecture requirements before selecting an analysis tool for your desktop virtualization project. Start with the applications you are certain are used widely, and work your way down from there. The list below, while not exhaustive, should provide a good baseline for understanding your applications, and it can guide the choices you make at the design stage.
The basics: The application name, the application vendor, the current version in use, the latest version available, any support agreement (including pertinent info such as whether the vendor will support the application if it is running on a virtual platform).
Supported platforms: What operating systems can the application run on? Windows Client (x64 or x86), Windows RDS, Linux, OS X, iOS, Android, etc.?
Application scope: Which groups will require the application? I normally divide this into Core (used by all users), Departmental (used by specific departments) and Group (used by specific subsets of departments, such as lawyers in Legal, accountants in Finance, etc.). Depending on the business, you may need to delineate this differently.
Licensing: How many users need the application? How many licenses do you have? What is the license type (per user, per device, usage count, etc.)?
Application usage: How often is the application used? Is it used every day (such as email), or only once a year (such as software for financial end-of-year calculations)? You can divide this into categories (such as heavy, medium, light) or be more specific (daily, weekly, monthly, etc.)
Delivery type: How can the application be delivered? Does it need to be natively installed, or can it be packaged through App-V, ThinApp, or the like? Can it be published on XenApp or RDS? Is it a SaaS application delivered through the browser? Can it be done through containerization tech (such as Spoon)? Can it be used via layering technologies like Unidesk or App Volumes? Is the app portable or even standalone? Technologies like Citrix’s AppDNA or ChangeBASE from Dell (formerly Quest) can be very helpful for this specific information.
Footprint: How much of each resource does the application typically use (memory, CPU, bandwidth, IOPS, etc.)? Does it have any potential conflicts with other applications?
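The attributes above amount to a per-application record. A minimal sketch of such a record follows; the field names, categories, and example values are my own illustrative assumptions, not a schema used by any of the assessment tools mentioned here.

```python
from dataclasses import dataclass, field

@dataclass
class AppRecord:
    """One application's entry in the analysis inventory."""
    name: str
    vendor: str
    current_version: str
    latest_version: str
    vendor_supports_virtual: bool                # support agreement detail
    supported_platforms: list = field(default_factory=list)
    scope: str = "Core"                          # Core / Departmental / Group
    licenses_owned: int = 0
    users_required: int = 0
    usage: str = "daily"                         # daily / weekly / monthly / seasonal
    delivery_options: list = field(default_factory=list)  # native, App-V, RDS...

# Hypothetical example entry:
outlook = AppRecord(
    name="Outlook", vendor="Microsoft",
    current_version="2010", latest_version="2013",
    vendor_supports_virtual=True,
    supported_platforms=["Windows x86", "Windows x64", "Windows RDS"],
    scope="Core", licenses_owned=5000, users_required=4800,
    usage="daily", delivery_options=["native", "App-V"],
)

# A simple red flag to compute per record: a licensing shortfall.
shortfall = max(0, outlook.users_required - outlook.licenses_owned)
```

Capturing the data in a consistent structure like this, whatever the actual tool produces, is what lets you query it later: every Core application with no virtualizable delivery option, every app short on licenses, and so on.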
Choosing an Application Delivery Mechanism
You should be getting a feel now for how vital this information is—particularly the delivery type. Identifying and selecting the application delivery mechanism(s) required will allow you to plan for the technologies you’ll need to encompass in your desktop virtualization solution.
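Once each application’s viable delivery types are known, the set of technologies the solution must encompass falls out of them directly. A sketch, with invented application names and an assumed preference order (your ordering will depend on cost, skills, and existing investment):

```python
# Viable delivery types per application (illustrative data only):
apps = {
    "Outlook": ["native", "App-V"],
    "SAP GUI": ["native", "XenApp"],
    "ExpenseTool": ["SaaS"],
}

# Assumed preference: lightest-touch delivery first.
preference = ["SaaS", "App-V", "XenApp", "native"]

# Pick the first preferred option each app supports.
chosen = {app: next(o for o in preference if o in options)
          for app, options in apps.items()}

# The union of chosen mechanisms is what the design must plan for.
required_tech = set(chosen.values())
```

Even with only three applications, the exercise shows the design implication clearly: each distinct mechanism in `required_tech` is infrastructure you have to build, license, and operate.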
What tools exist to help us acquire some or all of this information?
Citrix AppDNA and Dell ChangeBASE have already been mentioned for identifying delivery types. They both take source files and produce compatibility reports, identifying whether applications can be virtualized or published. For this particular information, they offer excellent insight. Another compelling reason to choose AppDNA is that it’s now part of the Platinum license offering on XenDesktop. If you’re a Citrix VDI customer, AppDNA should be an automatic addition to the armoury for this phase.
There are perhaps four other technologies you should consider.
- Liquidware Labs Stratusphere FIT
- Lakeside SysTrack
- Dell (formerly Quest) Workspace Assessment
- Flexera AdminStudio Virtual Desktop Assessment
Your choice of tech will depend on a number of factors—not least whether you have any existing investment from the companies involved—but of these four, Stratusphere FIT and Flexera AdminStudio Virtual Desktop Assessment provide the most complete overview based around the parameters we have set. Stratusphere has a ten-day trial version available. Flexera offers a twenty-one-day trial. While most of these technologies are dedicated tools for VDI planning, Lakeside’s offering includes ongoing monitoring and tuning beyond the application analysis phase, which can be a deciding factor in choosing which to use.
Clearly, the time you spend in the analysis phase is vital. You need to balance getting a good overview of the applications in use against delaying the project unduly. I recommend spending at least a month on this, although the exact amount of time allotted will depend on a number of other factors in play.
Getting Off to a Good Start
Coupling one of the assessment technologies we have mentioned with knowledge from the IT department and the users should help you get your application analysis phase off to a good start. It cannot be stressed enough how important accurate information is at this stage: it will allow you to make the right selection of technologies in the solution you are putting together. There are always applications that slip through the assessment, but if you manage to capture at least 90% of them at this point, you will have limited the potential re-engineering of your solution as much as possible. At the end of the day, your users just want their applications to work, and work well. The analysis phase of your desktop virtualization project is a key part of ensuring that required applications will be available when you get to the implementation stage.