Tag Archives: virtualization

The People behind Virtualization Technologies

As technologists, we tend to focus on a product’s technology itself. How does the software, hardware, appliance, widget, or whatever work? While this is certainly an important consideration, the people who design, build, sell, and support the product may have the greatest impact on the product’s usability.

4 Reasons The Calxeda Shutdown Isn’t Surprising

The board of Calxeda, the company trying to bring low-power ARM CPUs to the server market, has voted to cease operations in the wake of a failed round of financing. This is completely unsurprising to me, for a few different reasons.

Virtualization is more suited to the needs of IT

Calxeda’s view of the world competed directly with server virtualization in many ways. Take HP’s Project Moonshot as an example: a chassis with hundreds of small ARM-based servers inside it, each provisioned individually or in groups, but with small amounts of memory and disk. The problem is that this sort of model is complicated, fragile, inflexible, and not standards-based. At the end of the day, organizations want none of these things. Calxeda’s solution may save an enterprise money by consuming less power, but it gives those savings back as increased operational expense elsewhere. In contrast, virtualization of larger, more powerful CPUs is more flexible on nearly every level, reduces the amount of hardware an enterprise must manage, and can help contain both capital and operational expenses while solving actual problems.

There are diminishing performance returns in extreme multi-core applications

Metcalfe’s Law was originally stated to convey the increasing value of a network as more nodes join, but another way to express it is that the communications overhead in a network grows as the square of the number of nodes. The same holds in multi-threaded applications, where the interprocess communication, locking, and other administrative work needed to coordinate hundreds of threads can end up consuming more CPU time than the actual computational work. Calxeda’s vision of hundreds of CPU cores in a single system was ambitious, and it needed computer science and the whole industry to catch up to it. Enterprises don’t want research projects, so they chose fewer, faster cores and got their work done.
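As a toy illustration of that quadratic overhead (the function name and constants below are made up for this sketch, not drawn from Calxeda’s designs or any real benchmark):

```python
# Toy model of diminishing returns in extreme multi-core systems:
# useful work grows linearly with the core count, while coordination
# overhead (locking, interprocess messaging) grows roughly with the
# square of the number of cores.

def effective_throughput(cores, work_per_core=1.0, overhead_coeff=0.002):
    """Net useful work after subtracting a quadratic coordination cost."""
    return cores * work_per_core - overhead_coeff * cores ** 2

if __name__ == "__main__":
    # With these illustrative constants, throughput peaks around
    # 250 cores (work_per_core / (2 * overhead_coeff)) and then falls.
    for n in (4, 16, 64, 256, 480):
        print(f"{n:4d} cores -> effective throughput {effective_throughput(n):8.1f}")
```

The exact constants are arbitrary; the point is the shape of the curve, where adding cores past a certain point actually reduces the useful work done.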

A limited enterprise market for non-x64 architectures

ARM isn’t x86/x64. While there are increasing numbers of ARM-based Linux distributions, thanks largely to the immense popularity of hobbyist ARM boards like the Raspberry Pi and the BeagleBoard, none are commercially supported, which is a prerequisite for enterprises. On the Windows side there is Windows RT, which runs on 32-bit ARM CPUs, but it is generally regarded as underpowered and lacking features compared to Atom-powered x86 devices that run full installations of Windows 8. Windows RT isn’t a server OS, either, and there is very little third-party software for it, given the complexity of developing for the platform and the poor return on a developer’s time and money. Why put up with the complexity and limitations of a different architecture when you can get a low-power x86-compatible Atom CPU and a real version of Windows?

A limited market for 32-bit CPUs

On the server front, which is what Calxeda was targeting, enterprises have been consuming 64-bit architectures since the release of AMD’s Opteron CPUs in 2003. Ten years later, the idea of using 32-bit CPUs seems incredibly backward. Even embedded systems want more than 4 GB of RAM, which is the maximum a 32-bit CPU can directly address. On the mobile front, where ARM has had the most impact, Dan Lyons has a recent article about how Apple’s 64-bit A7 chip has mobile CPU vendors in a panic. Now, in order to compete with Apple, a handset maker wants a 64-bit chipset. Calxeda had a 64-bit CPU in the works, but it’s too far out to be useful in either market.
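That 4 GB ceiling is simple pointer arithmetic; a quick sketch (ignoring workarounds like PAE, which let some 32-bit systems see more physical memory at a cost in complexity):

```python
# A 32-bit pointer can name 2**32 distinct byte addresses, which caps
# the directly addressable memory of a 32-bit CPU at 4 GiB.
ADDRESS_BITS = 32
max_addressable_bytes = 2 ** ADDRESS_BITS
print(max_addressable_bytes)              # 4294967296
print(max_addressable_bytes / 1024 ** 3)  # 4.0 (GiB)
```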

I’ve never really seen the point behind the “more smaller machines” movement, and I interpret the end of Calxeda as evidence supporting my position. I’m sure there are specialized cases out there that make sense for these architectures, but the extreme limitations of the platform are just too much in the x64-dominated world of IT. In the end, Calxeda focused too tightly on specific problems, and in doing so ignored both the larger problems of the enterprise and the changes in the computing landscape that ultimately made it irrelevant.

See You at the Show

See you at the show! The time is almost here when several thousand people from around the world will make this year’s pilgrimage to San Francisco for VMworld 2013. This will be my last post before the start of the show, and as people pack and prepare to leave, I wanted to share what I think is the way to get the most out of your time there.

Where Does Virtualization Stop and the Cloud Start?

Where does virtualization stop and the cloud start? I was reading a post that one of our analysts, Edward Haletky, wrote, and the very first sentence caught my eye and really got me thinking.

“I was going to write about how building a cloud is similar to moving, but the more I think about it, the more I think people are confusing an automated virtual environment with a cloud: IT as a Service is not just about cloud. Having automation does not imply your virtual environment is a cloud, or vice versa.”

Some Thoughts on the Last Decade and 2011 in Review

I cannot believe the month of December is almost upon us. Every year around this time I like to reflect on the year and give my review and remarks. This is a special year for me, because it was around this time a decade ago that I was introduced to a cool new technology called virtualization through a neat new product called VMware Workstation. It was a magical moment when I first discovered the ability to run multiple operating systems, at the same time, on a single computer. I remember it well because it was true love at first install. Within a year I was playing with VMware ESX Server 1.5 and was given my first virtualization proof of concept, which was followed by my first production design and deployment. The rest, as they say, is history, as well as an amazing ride. On to 2011 in review.

Citrix forges ahead with a cloud services focus

Citrix’s annual Synergy conference, held this week in San Francisco, kicked off with CEO Mark Templeton painting his view of the future: building and leveraging cloud services. With the emergence and evolution of cloud services, Templeton believes the industry has moved out of the PC (personal computing) era and into a PC-3 era, incorporating personal, private, and public cloud services.