End User Computing security seems to be in the hands of the users, not the IT Security department. At least not yet. So what can we do about this? IT security can be draconian and not allow EUC devices into the office, but the users will be up in arms. They use their smart…
The next evolution of virtualization is the Software Defined Data Center, or SDDC. It is quickly becoming the next logical step in the continued evolution of cloud technology, giving you the ability to run legacy enterprise applications as well as other cloud services. In my opinion, you could also define the Software Defined Data Center as a converged data center, so to speak. My friend and colleague Edward Haletky wrote a great post on SDDC and data protection, which raised this question: how the heck do we recover an SDDC?
One aspect of SDDC that does not get a lot of attention is data protection; instead, we concentrate on SDN and automation. Yet this leads me back to data protection. There is a clear marriage between data protection and SDDC that needs to be added to any architecture. As with all things, we start with the architecture. Our SDDC architecture should also include data protection, but what data are we really protecting? Within the SDDC there are three forms of data: tenant, configuration, and automation. Without any one of them, we may not be able to reload our SDDC during a disaster. What is required to protect these three types of data, what really are these types of data, and how can we add data protection into the SDDC cleanly?
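The three forms of data and the recovery claim above can be sketched in a few lines. This is a minimal illustration, not any vendor's API; all class and item names here are hypothetical:

```python
# Hypothetical sketch: an SDDC backup plan must cover all three forms
# of data (tenant, configuration, automation) to be recoverable.
from dataclasses import dataclass, field

@dataclass
class SddcBackupPlan:
    tenant: list = field(default_factory=list)         # workload VMs and application data
    configuration: list = field(default_factory=list)  # network, storage, management config
    automation: list = field(default_factory=list)     # orchestration workflows, blueprints

    def is_recoverable(self) -> bool:
        # Missing any one category means we may not be able to
        # reload the SDDC after a disaster.
        return all([self.tenant, self.configuration, self.automation])

plan = SddcBackupPlan()
plan.tenant.append("tenant-vm-disks")
plan.configuration.append("virtual-network-config")
print(plan.is_recoverable())  # False: automation data not yet protected
plan.automation.append("provisioning-workflows")
print(plan.is_recoverable())  # True: all three categories covered
```

The point of the sketch is simply that recoverability is an all-or-nothing property across the three categories, which is why each one needs a seat in the architecture.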
SDDC Operations Management is going to require a new approach. Vendors with effective Operations Management solutions for today's virtualized data centers are in the best position to expand their offerings for the SDDC. Legacy vendors face a complete rewrite of their products and the adoption of a new business model (easy to try and easy to buy) that would destroy them financially; they will therefore be unable to react to the SDDC either technically or financially.
The SDDC and the cloud are going to require a new SDDC management stack that will need to be based upon a multi-vendor big data datastore. There will likely be on-premises and cloud-hosted versions of these datastores. Splunk, VMware, New Relic, The Pivotal Initiative, CloudPhysics, AppNeta, and Boundary are all excellent hypothetical suppliers of such a datastore.
Soon the backup power will be available for our new data center, and the redesign to make use of the VMware vCloud Suite is nearing completion. Soon, our full private cloud will be ready for our existing workloads. These workloads, however, currently run within a XenServer-based public cloud. So the question is: do we stay in…
I was going to write about how building a cloud is similar to moving, but the more I think about it, the more I think people are confusing an automated virtual environment with a cloud: IT as a Service is not just about cloud. Having automation does not imply your virtual environment is a cloud, or vice versa. Granted, IT as a Service is important for a cloud if you look at the NIST definition, but it is not sufficient on its own to make a cloud. Perhaps IT as a Service is just a stepping stone towards a cloud; perhaps it should start as a data center play?