I was in this industry when the IBM PC was launched in the early 1980s. As soon as the first model was replaced by one with a hard disk (5 MB, imagine that), and the first personal productivity applications arrived (WordPerfect, VisiCalc, and Harvard Graphics), the IBM PC morphed from a toy into a serious tool that users and businesses could use to enhance their productivity through computing, without having to plead for resources and support from what was then a data center dominated by mainframes and minicomputers.
End user computing was born, at least in North America and parts of the world; Northern Europe, perhaps wisely, never embraced the concept. It became a huge market that gave birth to many PC vendors, to Microsoft and Novell, and to a large number of supporting hardware and software vendors. For the first heady years of this revolution it was all about end user productivity, with no thought given to the management and support consequences of end users and end user departments acquiring, managing, and supporting their own technology.
In the early 1990s, departmental client/server computing coalesced around Windows clients and Windows NT servers running SQL Server, with applications built in departmental development tools like Visual Basic. An entire generation of tactical applications was written and deployed in large and small companies alike, many of which are viewed as painful legacy applications today. When the Internet arrived a few years later, end users and departments again built a large number of tactical applications around tools like ASP and SQL Server, many of which are now considered yet another layer of legacy applications that must be supported. All of this ushered in a huge amount of end user, departmental, and business productivity, but the vast majority of it occurred with no consideration of how the resulting stack of technology would be managed over time, or by whom. These trends ultimately led to the server proliferation and sprawl that VMware addressed with its first generation offerings.
So here we are in 2009, having gone through several waves of technology adoption, standing on the precipice of Cloud Computing. Many people think of external clouds as a natural and seamless extension of internal clouds, with workloads shifting from the internal cloud to the external cloud ("cloud bursting") as dictated by demand and capacity. In other words, the idea is that existing IT-hosted applications will be shifted to external clouds for execution at least some of the time. I see huge problems with this use case for Cloud Computing, but those problems are not the subject of this article. The subject of this article is an entirely different use case.
I think the early enterprise uses of Cloud Computing (in any company with a significant IT department) will again come from users, departments, and business constituents with a reasonable level of their own technical expertise, who will find that they can bring up "tactical" or prototype business applications on external clouds much more easily than in the internal labs (even virtualization-based ones) offered by their IT departments. Cloud Computing offers these business constituents the opportunity to experiment with technology solutions that might fit various business cases and scenarios without having to ask, explain, fill out forms, and wait for a response from IT. External cloud vendors are structuring their entire businesses around being easy to work with, both technically and commercially. I therefore think that Cloud Computing will be the basis of the next big end run around IT and its structures and processes.
If this happens as I project, it creates two huge challenges for IT:
- Soon after the first five or ten tactical applications are deployed for prototype purposes on an external cloud, one of them will become baked enough in the mind of the business that the business will want to roll it out to all or a significant portion of the organization. IT will then face the challenge of importing that application (and its guests) into its production environment, having had nothing to do with how those guests were originally set up and provisioned.
- Even assuming IT can technically import these guests from the external cloud, the question remains whether IT can compete with the external cloud vendor on the ongoing cost of operating the application.
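To make the second challenge concrete, here is a back-of-the-envelope sketch of the kind of cost comparison IT will be asked to win. Every figure in it is an illustrative assumption (hypothetical host price, VM density, overhead, and cloud rate), not measured data from any vendor.

```python
# Back-of-the-envelope comparison of internal vs. external per-VM operating
# costs. All numbers below are illustrative assumptions, not real prices.

def monthly_cost_per_vm(capex_per_host, host_lifetime_months,
                        vms_per_host, monthly_opex_per_host):
    """Amortized monthly cost of one VM on internally owned hardware."""
    amortized_capex = capex_per_host / host_lifetime_months
    return (amortized_capex + monthly_opex_per_host) / vms_per_host

# Hypothetical internal numbers: a $6,000 host amortized over 36 months,
# hosting 15 VMs, with $400/month for power, cooling, licenses, and admin time.
internal = monthly_cost_per_vm(6000, 36, 15, 400)

# Hypothetical external cloud rate: $0.10 per VM-hour, ~730 hours per month.
external = 0.10 * 730

print(f"internal: ${internal:.2f}/VM-month, external: ${external:.2f}/VM-month")
```

With these assumed numbers the internal VM comes out cheaper, but the comparison flips easily as VM density, utilization, or administrative overhead changes, which is exactly why IT needs to know its real numbers before the business asks.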
Smart IT departments will put in place tools from vendors that deliver comprehensive management of the virtualization life cycle, control of guest content, portability of guests across virtualization and cloud platforms, and federated self-service for business constituents.
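As a minimal sketch of what "control of guest content" might look like in practice, the check below validates an externally provisioned guest against an import policy before it is admitted to production. The policy fields, approved OS list, and guest metadata shown are all hypothetical, not taken from any particular product.

```python
# A minimal, hypothetical guest-content check that an import tool might run
# before admitting an externally built VM into the production environment.

REQUIRED_TAGS = {"owner", "business-unit"}          # assumed policy tags
APPROVED_OS = {"windows-server-2008", "rhel-5"}     # assumed approved builds

def import_check(guest):
    """Return a list of policy violations; an empty list means the guest passes."""
    problems = []
    if guest.get("os") not in APPROVED_OS:
        problems.append(f"unapproved OS: {guest.get('os')}")
    missing = REQUIRED_TAGS - set(guest.get("tags", {}))
    if missing:
        problems.append(f"missing tags: {sorted(missing)}")
    if not guest.get("management_agent_installed", False):
        problems.append("management agent not installed")
    return problems

# An externally provisioned guest that IT had no hand in setting up:
guest = {"os": "windows-server-2008", "tags": {"owner": "marketing"}}
print(import_check(guest))
```

Running this flags the missing business-unit tag and the absent management agent, which is precisely the kind of gap IT inherits when it imports guests it did not provision.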
Bernd Harzog