The Virtualization Practice

Does an evaluation for a virtualization project need to be only an exercise in understanding whether X hosts will run on Y servers? Will you be able to virtualize every service you deliver? Are new applications required? What are your existing service levels and requirements across your application portfolio? In most enterprises today, IT is a cost centre, not a profit centre. Business units often want detailed involvement in implementation plans, asset purchases, and ownership: it is not unusual for application requests to arrive in terms of functionality, not in terms of service levels. With the release of Workspace iQ, Centrix Software appears to be unique in endeavouring to aggregate the data that can give IT improved costing information without relying on specific vendors' solutions being in place.

During the last Virtualization Security Podcast, our guest had to postpone, so we turned to several interesting topics related to digital forensics and how encryption would best work within the virtual environment. Our very own Michael Berman, a forensic investigator in a previous life, had some great insights into the problems of digital forensics within the virtual environment.

Both Infrastructure Performance Management and Application Performance Management vendors targeting the virtualization and cloud markets have realized that new and unique data is needed to manage the performance of these new environments and the applications that run on them. This is a dramatic departure from the old physical world, where most vendors simply relied upon data provided via standard OS APIs to infer system and application performance.
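To make the contrast concrete, here is a minimal sketch of that old-world approach: sampling only what the guest OS reports through its standard APIs. Python and the psutil library are my illustration, not tooling named by any vendor, and the caveat in the comments is the whole point of the paragraph above.

```python
# A minimal sketch of the "old world" approach: sampling guest-visible
# metrics through standard OS APIs (here via the psutil library, which
# is my choice of illustration, not one named in the original post).
import psutil

def sample_guest_metrics() -> dict:
    """Collect CPU and memory figures exactly as the guest OS reports them."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),  # averaged over 1 second
        "mem_percent": psutil.virtual_memory().percent,
    }

# Caveat: inside a VM these numbers reflect only what the hypervisor
# exposes to the guest. CPU ready time, memory ballooning, and storage
# queuing at the host are invisible here, which is exactly why this data
# alone is insufficient for virtualized environments.
print(sample_guest_metrics())
```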

GestaltIT Tech Field Day: Virtualization Line Up

I participated in GestaltIT's Tech Field Day, a sort of inverse conference where bloggers and independent analysts go to the vendors and then discuss the information they have received. We visited the following virtualization vendors:

* vKernel, where we were introduced to their Predictive Capacity Planning tools
* EMC, where we discussed integration of storage into the virtualization management tools, as well as other hypervisor integrations
* Cisco, where CVN and CVE were discussed in detail

Depth vs. Breadth in Virtualization Performance Management

Enterprises that are going to support business-critical and performance-critical applications on a virtual infrastructure should, at a minimum, address two needs. The first is to get a true and complete picture of infrastructure performance based upon Infrastructure Response Time. The second is to put in place the tools required to monitor these applications in production.
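As a rough illustration of the Infrastructure Response Time idea, the sketch below times a single synchronous write from inside a guest. This is emphatically not how IPM products measure IRT (they instrument the hypervisor and storage layers); it is only a self-contained way to see the shape of the metric, and the 4 KiB payload size is my assumption.

```python
# A crude illustration of infrastructure response time: how long does the
# storage layer take to service one synchronous write? Real IPM products
# measure this below the guest; this probe only sketches the concept.
import os
import time
import tempfile

def storage_response_time_ms(payload: bytes = b"x" * 4096) -> float:
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        os.write(fd, payload)
        os.fsync(fd)  # force the write through to the storage stack
        return (time.perf_counter() - start) * 1000.0
    finally:
        os.close(fd)
        os.remove(path)

print(f"one 4 KiB synchronous write took {storage_response_time_ms():.2f} ms")
```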

GestaltIT Tech Field Day: Storage Line Up

I participated in GestaltIT's Tech Field Day, a sort of inverse conference where bloggers and independent analysts go to the vendors and then discuss the information they have received. We visited the following storage vendors:

* Data Robotics, where we were introduced to the new Drobo FS
* EMC, where we discussed stretched storage and other interesting futures
* HP, where we were introduced to the IBRIX products

One thing I have learned in my time working in IT is that no software product, out of the box, will do everything you want it to do. This especially goes for VMware's vCenter Server. It is a great product, but it still has its shortcomings. vCenter will perform many of the tasks we need and can report on the information we need to know about our virtual environments, but unfortunately not everything we need can easily be gathered in bulk across multiple servers.
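As an example of the kind of bulk gathering the paragraph above refers to, here is a hedged sketch using pyVmomi, VMware's Python SDK, to walk the inventory and print a few facts per VM. The hostname, credentials, and chosen properties are placeholders; this is one possible approach, not the method from the original post.

```python
# A hedged sketch of pulling VM facts in bulk with pyVmomi (VMware's
# Python SDK). Host, credentials, and the properties printed below are
# placeholders; adapt them to what your environment actually needs.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab use only; verify certs in production
si = SmartConnect(host="vcenter.example.com", user="readonly",
                  pwd="changeme", sslContext=ctx)
try:
    content = si.RetrieveContent()
    # A container view enumerates every VM under the root folder in one pass.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    for vm in view.view:
        cfg = vm.summary.config
        print(cfg.name, cfg.numCpu, cfg.memorySizeMB, vm.runtime.powerState)
    view.Destroy()
finally:
    Disconnect(si)
```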

Since writing VMware vSphere and Virtual Infrastructure Security: Securing the Virtual Environment, I have continued to consider aspects of digital forensics and how current methodologies would be impacted by the cloud. My use case is 40,000 VMs on 512 servers with roughly 1,000 tenants: what I would consider a medium-sized, fully functioning cloud built upon virtualization technology, where the environment is agile. The cloud would furthermore contain roughly 64 TB of disk across multiple storage technologies and 48 TB of memory. If you do not think this exists today, you were not at VMworld 2009, where such a monster was the datacenter for the entire show and sat just as you came down the escalators to the keynote session.
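For a sense of the densities those figures imply, a quick back-of-envelope pass (my arithmetic, not numbers from the original post) follows:

```python
# Back-of-envelope averages implied by the figures above.
vms, hosts = 40_000, 512
mem_tb, disk_tb = 48, 64

print(f"{vms / hosts:.0f} VMs per host")                # ~78
print(f"{mem_tb * 1024**2 / vms:.0f} MiB RAM per VM")   # ~1258 (about 1.2 GiB)
print(f"{disk_tb * 1024**2 / vms:.0f} MiB disk per VM") # ~1678 (about 1.6 GiB)
```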