The Virtualization Practice

Cloud Computing

Cloud Computing focuses on how to construct, secure, manage, monitor, and use public IaaS, PaaS, and SaaS clouds. Major areas of focus include barriers to cloud adoption, progress on the part of cloud vendors in removing those barriers, where the line of responsibility is drawn between the cloud vendor and the customer for each of IaaS, PaaS, and SaaS clouds, and the management tools that are essential to deploy in the cloud, ensure security in the cloud, and ensure the performance of applications running in the cloud. Covered vendors include Amazon, VMware, AFORE, CloudSidekick, CloudPhysics, ElasticBox, Hotlink, New Relic, Prelert, Puppet Labs, and Virtustream.

Amazon’s Service Level Agreement (SLA) is so narrowly drawn that it could easily be argued that the recent Elastic Block Store (EBS) outage wasn’t a failure of Amazon Web Services at all. Anyone using EBS in a production environment was, arguably, reaping the fruits of their own folly. Of course, they don’t tell you when you read the hype that architecting for resilience in the cloud is actually very complicated, particularly if you want to take the sensible step of not relying on a single provider like Amazon, no matter how dominant that provider may be.
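
As one small illustration of what that resilience work looks like, the sketch below copies an EBS snapshot into a second AWS region so that a single regional EBS failure does not take the only recovery point with it. It is a minimal sketch, assuming the boto3 SDK and already-configured AWS credentials; the snapshot ID and region names are placeholders, and spreading across multiple providers rather than multiple regions is harder still.

    # Minimal sketch: copy an EBS snapshot to a second region so a regional
    # EBS outage does not destroy the only recovery point.
    # Assumes boto3 and AWS credentials are configured; the snapshot ID and
    # region names below are hypothetical placeholders.
    import boto3

    SOURCE_REGION = "us-east-1"              # region holding the production volume
    BACKUP_REGION = "us-west-2"              # independent region for the copy
    SNAPSHOT_ID = "snap-0123456789abcdef0"   # hypothetical snapshot of the EBS volume

    ec2_backup = boto3.client("ec2", region_name=BACKUP_REGION)

    # copy_snapshot is issued in the destination region and pulls the snapshot across
    response = ec2_backup.copy_snapshot(
        SourceRegion=SOURCE_REGION,
        SourceSnapshotId=SNAPSHOT_ID,
        Description="Cross-region copy for EBS outage resilience",
    )
    print("Created cross-region copy:", response["SnapshotId"])

Even this one step has to be scheduled, monitored, and restore-tested, which is exactly the kind of complexity the hype glosses over.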

Running VMware on legacy infrastructure is like driving a Ferrari on a gravel road. If you look at what is run in most production VMware environments today, the only really new things in the environment are VMware vSphere and possibly some new monitoring, security, and backup tools. We have barely started to reinvent everything that needs to be reinvented in order to properly take virtualization, IT as a Service, and public clouds to their logical and most beneficial conclusions.

I was reading the post Small Business Virtualization, and it really got me thinking about small to medium businesses and what part cloud computing will play in that market. There are plenty of small businesses in and around my area, and a couple of my friends own some of them. A majority of these small businesses have one or a few point-of-sale machines that feed into an accounting program. These are the businesses I think of when I think of what a small business is. Would virtualization help these companies? Sure, I think so, but would it really be worth the cost to set up and maintain?

EMC, the majority owner of VMware, has agreed with the Department of Justice not to acquire 33 virtualization patents from Novell as part of a side transaction in the acquisition of Novell by Attachmate. The statement from the Department of Justice sheds significant light on the deal that had been struck between Novell and a newly created company formed by Microsoft, EMC, Apple, and Oracle to acquire a portfolio of patents for $450M, and on the antitrust threat that the Department of Justice saw to the Open Source community. And whilst the spotlight has been on Microsoft’s role, it seems that EMC’s role in seeking to acquire virtualization patents was at least as concerning to the Department of Justice.

When CloudFoundry was announced, my first thought was that this is a nightmare waiting to happen. Why do I think this? Because I was not thinking about Open Source developers but enterprise developers, and the biggest issue with enterprise development is that the data used by developers is sometimes made-up test data but, more often than not, actual production data. So the question becomes: how can such data be protected when it is used within a PaaS public cloud?
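
One common answer is to pseudonymize sensitive fields before any production data leaves the enterprise for a public PaaS. The sketch below is a minimal illustration of that idea; the field names, salt, and record layout are hypothetical and not drawn from CloudFoundry or any particular enterprise system.

    # Minimal sketch: pseudonymize identifying fields so developers on a public
    # PaaS work against realistic but non-sensitive records.
    # Field names and the salt are hypothetical placeholders.
    import hashlib

    SALT = "replace-with-a-secret-salt"      # kept inside the enterprise, never deployed
    SENSITIVE_FIELDS = ("name", "email", "card_number")

    def pseudonymize(record):
        """Return a copy of the record with sensitive fields replaced by salted hashes."""
        masked = dict(record)
        for field in SENSITIVE_FIELDS:
            if field in masked:
                digest = hashlib.sha256((SALT + str(masked[field])).encode("utf-8")).hexdigest()
                masked[field] = digest[:12]  # short, stable token that still joins correctly
        return masked

    # What a developer would see instead of the real customer row
    print(pseudonymize({"name": "Jane Doe", "email": "jane@example.com", "total": 42.50}))

Because the salted hash of a given value is stable, masked records still join and aggregate correctly in the development environment without exposing the original data.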

VMware’s latest effort, CloudFoundry, is not about VMware delving even deeper into the PaaS market; they have already done that with VMforce. CloudFoundry is instead a fairly astute move to enable the development and rapid adoption of cloud-based applications. The end goal is to sell the enabling software that makes up a PaaS environment, which would allow enterprises and businesses to move to the cloud. The problem with moving now is that there are not many applications that are cloud friendly. In effect, this puts more concentration on the application and less on the operating system, which has always been VMware’s strategic direction.
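
What "cloud friendly" means in practice is sketched below: a stateless application that takes its configuration from the environment and binds to whatever port the platform assigns, so a PaaS such as Cloud Foundry can start, stop, and scale instances freely. This is an illustrative example assuming Flask and modern Cloud Foundry conventions, not code from VMware or the CloudFoundry project.

    # Minimal sketch of a cloud-friendly application: stateless, configured from
    # the environment, and bound to the platform-assigned port.
    import os
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/")
    def index():
        # No local state: everything needed to answer comes from the request
        # or from backing services located via environment configuration.
        return jsonify(message="hello from a cloud-friendly app",
                       instance=os.environ.get("CF_INSTANCE_INDEX", "unknown"))

    if __name__ == "__main__":
        # The platform, not the application, decides which port to listen on.
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))

Most legacy enterprise applications fail one or more of these tests, which is precisely why VMware sees an opening for software that helps build new ones.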

Facebook (which had previously bought commodity servers and rented data center space) has opened up a whole new area of Open Source technology by publishing the full specification of both its new custom server and its new data center as “Open Source” at OpenCompute.org. Overall, Facebook claims that its new data centers are 38 per cent more efficient than its existing leased data centers and cost about 20 per cent less. Published data (such as it exists) indicates that Facebook is at or ahead of rivals or peers such as Microsoft and Google. OpenCompute designs are released under a new set of Open Source agreements. The intent seems to be to allow innovation within the published specification, but to ensure multiple providers of the technology. Facebook is clearly seeking multiple tier-1 third-party providers for both servers and data centers built to these designs, turning these Open Source specifications into a form of de facto standard, which could have a broad impact by driving the marketplace away from shared-storage models (such as Red Hat’s IaaS reference architecture) toward local-storage-friendly IaaS architectures such as OpenStack or Eucalyptus.

Harris Trusted Cloud – Closing the Gap

On the 4/7/2011 Virtualization Security Podcast, we were joined by Wyatt Starnes of Harris Corporation. Wyatt is the Vice President of Advanced Concepts of Cyber Integrated Solutions at Harris, which makes him one of the key people behind the Harris Trusted Cloud initiative. Trust is a funny word, and we have written about that in the past. Harris’ approach is unique in that they are attempting to ensure the integrity of all components of the cloud down to the code level, not just the network, with their target being the hosted private cloud and not the secure multi-tenant public cloud.
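
To make "integrity down to the code level" concrete, the sketch below shows the general shape of such a check: hash the deployed binaries and compare them against a known-good manifest, flagging anything that has drifted. The manifest format and file paths are hypothetical; the podcast does not describe Harris’ actual implementation, which would presumably anchor these measurements in hardware and run them continuously rather than as a one-off script.

    # Minimal sketch of a code-level integrity check: compare SHA-256 digests of
    # deployed files against a known-good manifest. Paths and digests are
    # hypothetical placeholders.
    import hashlib
    import os

    def sha256_of(path, chunk_size=65536):
        """Hash a file in chunks so large binaries do not have to fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as handle:
            for chunk in iter(lambda: handle.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify(manifest):
        """manifest maps file paths to their expected SHA-256 digests."""
        for path, expected in manifest.items():
            if not os.path.exists(path):
                print("MISSING :", path)
            elif sha256_of(path) != expected:
                print("MODIFIED:", path)
            else:
                print("OK      :", path)

    # Hypothetical known-good manifest captured when the component was provisioned.
    verify({"/usr/sbin/sshd": "0f3c-replace-with-known-good-digest"})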