The software-defined data center was all the rage at VMworld 2013, when NSX, VMware’s network virtualization platform, was announced. By VMworld 2014, some of the NSX honeymoon glow had worn off, yet network virtualization remains a key technology that is still growing and maturing. Nevertheless, it is still too immature for all-out adoption at this point, in my opinion. During the VMworld 2014 Tech Field Day Extra, I had an opportunity to sit in on a briefing and demo from Nuage Networks.
Come on, let’s get real here. The software-defined data center may become the norm in two years in the gilded cages of Silicon Valley, North Carolina’s Research Triangle, and the other “centers of excellence” out there. But in the real world—you know, the one where companies are still using NT4 servers to deliver real and useful work—surely this is not the case.
My response to Stephen Foskett’s tweet of a post about the Software-Defined Data Center (SDDC) Symposium led to an interesting conversation about the nature of the SDDC—what it is, what it is not, and why we should care. The software-defined data center is considered by some to be an instrument of vendor lock-in, vaporware, or in many ways just marketing hype. “SDDC” has many definitions, but I do not believe any of the commonly used ones captures it. Instead, I hold that it is a way of thinking, a way of looking at the new world of IT in which we live. This sparked quite an interesting Twitter conversation among many interested parties.
At the US VMworld 2013 conference, VMware did an excellent job of explaining how network virtualization and storage virtualization were going to work. Adding network virtualization and storage virtualization to the existing virtualization of compute (CPU and memory), along with the APIs and policies to manage the whole thing, is what creates a software-defined data center.
VMware has announced its log management product, Log Insight. Log Insight is priced at $200 per monitored OS instance (per-VM pricing) and is to be available in Q3 of this year. VMware’s own vSphere environment is the first targeted environment, and the first use case is operations management. Right now this is clearly a 1.0 offering competing with a very mature Splunk Enterprise offering, but there are some very interesting short-term and long-term dynamics at play.
Over the last few weeks, I have been struggling with automating the deployment and testing of virtual desktops for my own edification. This struggle has pointed out automation weaknesses that need to be addressed if automation and the software-defined data center are to succeed—not only deployed from software, but also self-healing and all the other great things we associate with SDDC, SDN, etc. Current automation, however, has some serious flaws and weaknesses. In essence, in order to automate something, you must have a well-known, exact image from which to work.
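The "well-known exact image" requirement can be sketched in code. The snippet below is a minimal illustration, not any real provisioning tool: it assumes a hypothetical golden-image registry keyed by name, and refuses to proceed unless the image bytes match a recorded SHA-256 digest—the kind of gate an automation pipeline needs before it can safely deploy or self-heal.

```python
import hashlib

# Hypothetical golden-image registry: image names mapped to their
# known-good SHA-256 digests. In a real pipeline these digests would
# come from the build system, not be computed inline like this.
GOLDEN_IMAGES = {
    "desktop-base-v1": hashlib.sha256(b"pristine base image contents").hexdigest(),
}

def image_is_known(name: str, contents: bytes) -> bool:
    """Return True only if the image matches its recorded golden digest."""
    expected = GOLDEN_IMAGES.get(name)
    if expected is None:
        return False  # unknown image: automation cannot reason about it
    return hashlib.sha256(contents).hexdigest() == expected

# An image that matches its golden digest is safe to automate against;
# one that has drifted (e.g., patched by hand) is not.
print(image_is_known("desktop-base-v1", b"pristine base image contents"))
print(image_is_known("desktop-base-v1", b"hand-patched image contents"))
```

The point of the sketch is the failure mode: the moment an image drifts from its known state, every downstream automation step inherits that uncertainty, which is exactly the weakness described above.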