In my Enterprise Desktop Strategy paper, released back in September 2009, I defined what organizations should consider as they look to incorporate desktop virtualization into their environments. I explored the different components, processes, and tools that bring the concept of the enterprise desktop together. Four years have gone by, and we have been through several “year of VDI” hype cycles. What we have learned is that as much as desktop virtualization is an innovative solution, it can also be disruptive if it is not properly integrated as part of the whole desktop service.
The desktop is a service, just like your phone, electricity, or cable service, so you need to start thinking about how you keep the service running. There are many facets of the desktop that are affected by IT services in and out of your organization. So when you are planning your desktop service, you should be cognizant of all the different applications and services and how these pieces play into the overall desktop experience.
Getting a Hold on Your Current Service Inventory
A desktop transformation initiative does not need to be a forklift replacement of your desktop environment. Understanding what you have in terms of hardware, software, operational processes, and tools helps build a foundation. Then look at what compliance requirements you have for your environment. Will your desktop service be affected by any corporate compliance requirements (GLBA, HIPAA, PCI, etc.)? What do you have to do to make sure that you adhere to these standards?
While decoupling the components of the desktop (hardware, operating system, applications, and user data) is still a critical part of the thought process to create the desktop service, it is the application layer that requires the most preparation.
Your first step is to build a formal application inventory, using both automated and manual methods. An agent-based solution deployed to each desktop will collect the installed application information and can also gather usage data. The manual part of the process, which includes interviewing the users, validates what you have collected. It may also identify applications that have a low frequency of use but high importance to the user. In addition, the automated tools may fail to capture SaaS-based applications or Microsoft Office macro-based applications.
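The reconciliation step above can be sketched with simple set operations. The application names and data below are purely hypothetical stand-ins for what an agent scan and user interviews might return:

```python
# Hypothetical output of an agent-based inventory scan.
agent_scan = {"Microsoft Office 2007", "Adobe Acrobat 9", "Visio 2003"}

# Hypothetical interview findings: interviews surface SaaS tools and
# Office macro "applications" that leave no installer footprint.
interview_findings = {"Microsoft Office 2007", "TradeDesk (SaaS)",
                      "Budget macros (Excel)"}

# Applications the agent missed get added to the inventory manually.
missed_by_agent = interview_findings - agent_scan
full_inventory = agent_scan | interview_findings
```

The point of the sketch is simply that neither source alone is complete; the formal inventory is the union of the two, and the difference tells you where the agent has blind spots.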
Once you have your inventory, you need to take the time to weed out the unutilized or duplicate applications. It is not uncommon for organizations to have multiple versions of an application, or two or more applications that perform similar functions. You may have an older version of Microsoft Office because specific macros or applets were developed around it, but as you move forward in your desktop service definition you will need to set baseline application standards, which could mean that everyone needs to be on the same level of Office to make the service supportable.
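A first pass at rationalization can be done mechanically: group the inventory by product and flag anything with more than one version in circulation. The inventory rows below are hypothetical examples:

```python
from collections import defaultdict

# Hypothetical (product, version) rows from the application inventory.
inventory = [
    ("Microsoft Office", "2003"),
    ("Microsoft Office", "2007"),
    ("Adobe Acrobat", "9"),
    ("Foxit Reader", "3.1"),   # functionally overlaps Acrobat
]

# Collect the distinct versions seen for each product.
versions = defaultdict(set)
for product, version in inventory:
    versions[product].add(version)

# Products with multiple versions are rationalization candidates.
multi_version = {p: v for p, v in versions.items() if len(v) > 1}
```

Functional duplicates (Acrobat vs. Foxit above) still need a human decision, but the version sprawl at least falls out of the data automatically.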
On the procurement side, rationalization could save your organization thousands of dollars in license maintenance. It will also simplify the reporting and management of those licenses. Operationally, you will have less to support, which lowers your desktop service TCO.
It is important to go through the application inventory and formally determine whether each application will function on your proposed desktop platform. Some applications may install and perform a first run without any issue, yet features deeper in the product can still fail. For companies that have hundreds or thousands of applications, investing in an application compatibility tool (Citrix AppDNA or Dell ChangeBase) provides a standard approach to verifying the compatibility of legacy and new applications. It can also become part of your operational process as new applications, application updates, and new operating system versions are introduced. After their analysis, these tools can also provide guidance on remediating non-compliant components and can output a distribution package.
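The output of that verification step is essentially a triage. As a rough sketch, assuming a hypothetical report with red/amber/green-style ratings (the applications and statuses below are invented, not the actual output format of AppDNA or ChangeBase):

```python
# Hypothetical compatibility report rows: (application, status).
report = [
    ("Payroll client",    "green"),  # ready to package as-is
    ("CRM plug-in",       "amber"),  # works after remediation
    ("Legacy 16-bit app", "red"),    # will not run on the new platform
]

# Bucket applications by status to drive the remediation backlog.
triage = {"green": [], "amber": [], "red": []}
for app, status in report:
    triage[status].append(app)
```

The amber bucket becomes your remediation work queue, and the red bucket forces an early conversation about replacement or isolation before the platform ships.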
Application packaging is the process by which you capture the installation routine of an application and put it into a standardized distribution format or package. The process enables the administrator to customize the application’s settings so that it will not require interaction by the user to configure it on first use. Having all of your applications packaged in a standard format also eases integration and maintenance.
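For MSI-based packages, the "no user interaction" goal usually comes down to a silent install command with a transform carrying the customizations. A minimal sketch, assuming a hypothetical helper that assembles the standard msiexec switches (/i install, /qn no UI, TRANSFORMS for the customization file, /l*v for a verbose log):

```python
def silent_install_cmd(msi_path, transform=None, log_path=None):
    """Build an unattended msiexec command line for a packaged MSI.

    This is an illustrative helper, not part of any packaging product.
    """
    parts = ["msiexec", "/i", msi_path, "/qn"]  # /qn = fully silent
    if transform:
        parts.append(f"TRANSFORMS={transform}")  # admin customizations
    if log_path:
        parts += ["/l*v", log_path]              # verbose install log
    return " ".join(parts)
```

For example, `silent_install_cmd("office.msi", transform="corp.mst")` yields a command the distribution system can run without any user prompts.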
It is essential to have a central method for application distribution. Whether you choose traditional installation methods (MSI formats) or application virtualization (App-V, ThinApp), this capability enables administrator and user control of provisioning and de-provisioning of applications and services to the desktop.
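Provisioning and de-provisioning are typically driven by group membership: the user's groups define a desired set of packages, and the distribution system reconciles the desktop against it. A minimal sketch, with a hypothetical entitlement mapping:

```python
# Hypothetical mapping of directory groups to application packages.
entitlements = {
    "all-staff": {"Office 2010", "AV agent"},
    "finance":   {"ERP client"},
}

def desired_packages(groups):
    """Union of the packages granted by each of the user's groups."""
    pkgs = set()
    for g in groups:
        pkgs |= entitlements.get(g, set())
    return pkgs

def reconcile(installed, groups):
    """Return (to_provision, to_deprovision) for one desktop."""
    desired = desired_packages(groups)
    return desired - installed, installed - desired
```

The same comparison handles both directions: a user who joins finance gets the ERP client pushed, and one who leaves it has the client removed on the next reconciliation.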
Once you have developed your inventory, created standard packages for each of your applications, and implemented a method of distribution, the next step is to establish a maintenance methodology. New applications and application updates should be put through a certification process to limit any possible negative effects they may have on the desktop platform. You should establish a maintenance window so that your organization knows when scheduled updates are occurring.
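Enforcing the window can be as simple as a gate check before a scheduled push is released. A sketch, assuming a hypothetical policy of Saturday 22:00 through Sunday 02:00:

```python
from datetime import datetime

def in_maintenance_window(ts: datetime) -> bool:
    """True if ts falls in the (hypothetical) Sat 22:00 - Sun 02:00 window.

    weekday(): Monday=0 ... Saturday=5, Sunday=6.
    """
    return (ts.weekday() == 5 and ts.hour >= 22) or \
           (ts.weekday() == 6 and ts.hour < 2)
```

A deployment job checks this before executing; anything queued outside the window simply waits, so changes land only when the organization expects them.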