In order to sell the Software Defined Data Center, VMware needs to prove that it delivers a hard-dollar ROI to its customers. Commoditizing the expensive networking hardware business and the equally expensive storage hardware business are two excellent ways for VMware to deliver that ROI, and to justify its price premium over competitive offerings.
At VMworld 2013 and on the Virtualization Security Podcast there were many conversations about VMware NSX. These conversations ranged from how we will implement this new technology to questions of security, scale, and other technical concerns. In addition, NSX, and what was needed to make it a reality, may be the answer to a nagging security question. Brad Hedlund, from the VMware NSX team, joined the Virtualization Security Podcast to share with us some of the details of VMware NSX.
When we look at the secure hybrid cloud, there seems to be a missing piece: one that validates identity via the role-based access controls assigned to the applications, data, and systems a user is allowed to access, and that does so dynamically rather than through normal static firewall rules that are either port- or VM-centric. The software defined data center needs security that moves with it, not security that remains static. Yes, we could manipulate the rules on the fly, but those manipulations require that we know who is using a particular VM at a given time; in the case of a server, the VM could be used by more than one user at a time, so we need something more dynamic. Privileged access to data needs to be enforced throughout the stack, not just within an application or by encrypting data. This is a key component of the software defined data center.
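The dynamic, identity-aware enforcement described above can be illustrated with a small sketch. Everything here — the session table, the role map, and the `permit` function — is a hypothetical illustration of the concept, not any particular vendor's firewall API:

```python
# Hypothetical identity-aware rule check: traffic decisions keyed to who
# is currently using a VM, not just to its port or address.

# Who is logged in to each VM right now (fed by an identity source in practice).
ACTIVE_SESSIONS = {"vm-42": {"alice", "bob"}}

# Role assigned to each user.
ROLE_OF = {"alice": "dba", "bob": "developer"}

# (role, target) pairs that are permitted.
ALLOWED = {("dba", "db-prod"), ("developer", "db-test")}

def permit(vm: str, target: str) -> bool:
    """Allow traffic from a VM only if some current user's role grants it."""
    users = ACTIVE_SESSIONS.get(vm, set())
    return any((ROLE_OF[u], target) in ALLOWED for u in users)
```

The point of the sketch: when a user logs out of vm-42 and is dropped from the session table, the paths their role granted are revoked immediately — the rule follows the user, not the VM.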
VMworld 2013 is upon us, and one of our tasks is to figure out which vendors' booths to go see. With over 230 booths to choose from, this is a daunting task. If you are interested in finding creative new solutions to your management, monitoring, deployment, security, data protection, and desktop management problems, this list will help you.
When we think of logging within the secure hybrid cloud, we tend to think of analytics, but there is more to logging than just reviewing the data: there are also discussions about what to collect, from where, and why. For security purposes we may start with collecting access data and work out from there, but most logs from complex systems such as a secure hybrid cloud include many different forms of log data — and in some cases, not enough. What log data you can retrieve may even be a deciding factor when choosing hybrid cloud services, as logs are used not only for audit purposes but also for troubleshooting and forensics. What log data do you collect within your secure hybrid cloud?
The hybrid cloud has hundreds, if not thousands, of APIs in use at any time. API security therefore becomes a crucial part of any hybrid cloud environment. There are only so many ways to secure an API: we can limit its access, check the commands, encrypt the data transfer, employ API-level role-based access controls, ensure we use strong authentication, and so on. However, it mostly boils down to depending on the API itself to be secure, because while we can do many things on the front end, there is a chance that once the commands and actions reach the other end (cloud or data center), the security could be suspect. So how do we implement API security within the hybrid cloud today?
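One of the front-end measures listed above — strong authentication of API calls — is often implemented as HMAC request signing, the general approach many cloud APIs take. The following is a minimal sketch, assuming a hypothetical shared key per client; the secret, path, and field layout are illustrative, not any specific provider's scheme:

```python
import hashlib
import hmac
import time

# Hypothetical per-client shared secret; real deployments provision and
# rotate these through a key-management system.
SECRET_KEY = b"example-shared-secret"

def sign_request(method: str, path: str, body: str, timestamp: int) -> str:
    """Compute an HMAC-SHA256 signature over the request elements."""
    message = f"{method}\n{path}\n{body}\n{timestamp}".encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify_request(method, path, body, timestamp, signature, max_skew=300):
    """Reject stale requests, then compare signatures in constant time."""
    if abs(time.time() - timestamp) > max_skew:
        return False
    expected = sign_request(method, path, body, timestamp)
    return hmac.compare_digest(expected, signature)
```

Note that signing only hardens the front end: it proves who sent the command and that it was not tampered with in transit, but it says nothing about what the far end does with the command once it arrives — which is exactly the residual trust problem described above.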
The recent events surrounding the treacherous activities of Edward Snowden should make most of us think long and hard about the measures we are taking to secure our corporate data. Are we giving our administrators too much access? Do we fail to audit and report on how the data is being accessed and used? Is our data just too mobile? Unfortunately, the answer to all three of these questions is yes.
At the recent Misti Big Data Security conference, many forms of securing big data were discussed, from encrypting the entire big data pool to encrypting just the critical bits of data within the pool. In several of the talks there was general discussion of securing Hadoop as well as access to the pool of data. These security measures include RBAC, encryption of data in motion between Hadoop nodes, and tokenization or encryption of data on ingest. What was missing was greater control over who can access specific data once that data is in the pool. How could role-based access controls by datum be put into effect? Why would such advanced security be necessary?
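Role-based access control by datum could look something like the following sketch, where each record carries its own set of permitted roles and every read is filtered accordingly. The records, roles, and labels are hypothetical illustrations of the idea, not a Hadoop or Hive security API:

```python
# Hypothetical per-datum RBAC: each record in the pool carries the set of
# roles allowed to read it, and a query returns only what the caller may see.

RECORDS = [
    {"id": 1, "value": "aggregate sales stats", "roles": {"analyst", "auditor"}},
    {"id": 2, "value": "raw card numbers",      "roles": {"auditor"}},
    {"id": 3, "value": "clickstream events",    "roles": {"analyst", "auditor"}},
]

def visible_records(role: str):
    """Filter the pool down to the records this role is labeled to see."""
    return [r for r in RECORDS if role in r["roles"]]
```

The design choice this illustrates: the access decision travels with the datum itself, so sensitive records stay protected even when they sit in the same pool as data that is broadly readable.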
Legacy management frameworks are going to be replaced by SDDC management platforms that combine big data back ends, analytics, and ecosystem-friendly data collection and integration strategies to give customers the best of both worlds. Customers will be able to choose from among best-of-breed solutions, and then integrate them at the data level via a big data back-end data store. This will revolutionize the management software industry, give rise to a new set of leaders in that industry, and completely destroy the legacy management frameworks.