The virtualization management industry (all aspects of physical systems monitoring, virtual systems monitoring, applications monitoring, and systems/applications provisioning) is undergoing an extremely rapid rate of change. In 2011, VMware put two very important stakes in the ground. The first was to combine performance management, capacity management, and configuration management with self-learning analytics in vCenter Operations. The second was to state that its go-forward management strategy was built around delivering OPEX savings through automated remediation (monitor → remediate → notify). These VMware initiatives will shape how management plays out in 2012, and create new winners and losers.
Articles Tagged with vSphere
The speed at which technology changes is absolutely amazing: as soon as you buy something, the next faster, bigger model comes out. I think back to when I started my career and remember the workstation I was using with a 200 MHz processor, and how thrilled I was when I got it bumped up to 64 MB of RAM. Back then, although the hardware was changing at blazing speed, you knew you had a three- to five-year run before you had to worry about upgrading and refreshing the operating system. VMware has been changing those rules over the last few years, with major releases coming out every two years.
In part one I looked at the overall macro trends in the desktop virtualization market. Now in part two I want to look at what to expect from key vendors: Microsoft, Citrix, VMware, and AppSense, as well as product groups such as thin client and storage vendors. All with an eye to desktop virtualization in 2012.
Since the virtualization industry was largely created by VMware, is largely being defined by VMware, and is (currently) being led by VMware, it is worth taking a look back at 2011 from the perspective of VMware’s steps, key strategic directions, and missteps.
The Virtualization Practice was recently offline for two days; we thank you for coming back to us after this failure. The cause was a simple fibre cut that would have taken the proper people no more than 15 minutes to fix, but we were far down the repair list due to the nature of the storm that hit New England and took three million people off the grid. Even our backup mechanisms were out of power: while our datacenter had power, the rest of the area in our immediate vicinity did not. So not only were we isolated from reaching any clouds, but we were also isolated from being reached from outside our own datacenter. The usual solution to such isolation is remote sites and locating services in other regions of the country, but this gets relatively expensive for small and medium businesses. Can the hybrid cloud help here?
The 9/22 Virtualization Security Podcast featured Anil Karmel, Solutions Architect at Los Alamos National Laboratory (LANL), discussing LANL’s implementation of a secure multi-tenant (SMT) cloud. LANL makes extensive use of the entire VMware product suite, from vCloud Director down to the vShield components, to implement its SMT cloud. It has also added its own intellectual property into the cloud to improve overall security. It was a very interesting conversation about the state of SMT today.