A major aspect of virtualizing any business critical application is data protection, which encompasses not only backup but also disaster recovery and business continuity. While data protection matters for all workloads, it becomes a bigger concern when virtualizing business critical applications: we need not only backups but also measures that protect the business itself, which is where business continuity comes into play. Continue reading Virtualizing Business Critical Applications: Data Protection
As I shoveled even more snow, I started to think about automation (as in, how could I get something to shovel the snow for me?), which led to thinking about automation within the cloud. I see lots of discussion about automation in the cloud: many of my friends and colleagues are developing code using Puppet, Chef, vCenter Orchestrator, and similar tools. This development is about producing the software-defined datacenter (SDDC). However, I see very little in the way of security automation associated with the SDDC. Continue reading Security Automation = Good Security Practice
VMware, a company not known for establishing strategic partnerships with other software companies, has just made a very significant move: it has invested $30M in Puppet Labs and established a strategic partnership with the company. The goal of the partnership is to allow customers to realize the value of Puppet across a variety of VMware products, including vSphere, vFabric Application Director, Cloud Automation Center, vCenter Operations Manager, and vCenter Configuration Manager. Continue reading News: VMware Invests $30M in Puppet Labs – Establishes Strategic Partnership
One of the great things about Splunk, as both an Operations Management tool and an Application Performance Management tool, is the ease with which an astonishing variety of data sources can be fed into the Splunk data store. Splunk automatically indexes this data based upon time stamps and stores it in a back-end data store that scales out horizontally on commodity servers with commodity storage. This makes Splunk one of the very few management solutions that can scale out to accept the tsunami of management data that is generated across the infrastructure and application stack in a modern dynamic or cloud-based environment.
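Splunk's actual ingestion and indexing pipeline is far richer than can be shown here, but the core idea of extracting a timestamp from each raw event and organizing disparate sources on a common time axis can be sketched in a few lines of Python. The log lines and parsing format below are invented for illustration only:

```python
from datetime import datetime

# Hypothetical raw events from different sources (web, app, database).
# Splunk handles many timestamp formats automatically; here we assume
# a single known format purely to keep the sketch short.
raw_events = [
    "2013-02-06 10:15:02 webserver GET /index.html 200",
    "2013-02-06 10:14:58 appserver WARN connection pool nearly exhausted",
    "2013-02-06 10:15:05 database slow query: 2300ms",
]

def parse_event(line):
    """Extract the leading timestamp; keep the rest as the raw event."""
    ts = datetime.strptime(line[:19], "%Y-%m-%d %H:%M:%S")
    return ts, line

# Ordering events on their extracted timestamps is what makes
# time-range searches across disparate sources cheap.
index = sorted((parse_event(line) for line in raw_events), key=lambda e: e[0])

for ts, event in index:
    print(ts.isoformat(), "|", event)
```

Once events from every layer share one time axis, "show me everything that happened between 10:14 and 10:16" becomes a single query rather than a manual correlation exercise.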
The Splunk Architecture
The wealth of data sources that can be collected and indexed by Splunk is shown in the left portion of the image below. The scaled-out architecture that allows Splunk to keep up with the management data tsunami is shown in the rest of the diagram.
Now we come to the good news and the bad news. The good news is that Splunk can serve as your management data store across your physical hardware, virtualization layer, operating system layer, application infrastructure (middleware) layer, and the applications themselves.
The bad news is that, until today, if you wanted to pull together all of the data that pertained to a particular application, you had to be an expert in the topology of that application (where does it run?), in the virtual and physical infrastructure that supports it (what is it dependent on?), and in how to tie disparate data sources together in Splunk to create a cohesive view or dashboard. Organizations with one or a few mission critical applications of such high value that they warranted a dedicated support team could easily justify the required investment in learning. Organizations with thousands of business critical and performance critical applications saw this as an infinitely high cliff.
The Prelert Anomaly Detective for Splunk
The Prelert Anomaly Detective automatically learns the normal patterns in the Splunk data, automatically identifies anomalous behavior in that data, and uses the Splunk query language to find cross-correlated data and events.
The Prelert Anomaly Detective represents a significant advance in how customers can use Splunk and its data. Today, most customers use Splunk as a forensics tool to find the cause of a problem after some other tool or user has reported it. Combined with Splunk, Prelert can notify customers of anomalies that they did not even know to look for, anomalies that can easily be leading indicators of problems that have not yet been reported.
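Prelert's machine-learning models are proprietary and far more sophisticated than anything shown here, but the general idea of learning a baseline from the data and flagging deviations can be illustrated with a deliberately simple z-score detector over per-minute event counts (the data and threshold are invented for the sketch):

```python
import statistics

def find_anomalies(counts, threshold=2.5):
    """Flag indices whose value deviates from the mean by more than
    `threshold` standard deviations. A toy stand-in for real anomaly
    detection: the 'baseline' is just the mean and standard deviation
    of the series itself."""
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []  # a perfectly flat series has no anomalies to flag
    return [i for i, c in enumerate(counts)
            if abs(c - mean) / stdev > threshold]

# Hypothetical per-minute error counts; minute 7 spikes well above
# the baseline of roughly a dozen errors per minute.
counts = [12, 11, 13, 12, 10, 12, 11, 95, 12, 13]
print(find_anomalies(counts))  # → [7]
```

The value of surfacing such spikes proactively, rather than after a user complains, is exactly the shift from forensics to leading indicators described above.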
The complete Prelert announcement is here: “Prelert Introduces Anomaly Detective, an Advanced Predictive Analytics Solution for Splunk Enterprise Environments”
We’ve discussed the fact that VDI appliance makers have been making good progress in simplifying adoption of a virtual desktop infrastructure. An appliance-based route to market can be seen as a win-win, designed both to reduce the cost and complexity of implementation (for the customer) and to shorten sales cycles (for the vendor). So goes the theory. To test this theory, one VDI appliance vendor, Pivot3, commissioned Dimensional Research to survey global IT in order to get real-world insight into the state of VDI.
The survey showed that over 80% of respondents had VDI in their current strategy, and over 50% of those deploying VDI would utilize new hardware. Perhaps more interesting, the traditional stall points of VDI, hardware complexity and security, took a back seat in the list of concerns. The appliance model was undoubtedly popular, but if that problem is solved, what were organisations’ main concerns?
Cloud products and services are only in their infancy, but new and exciting technology is being released at an incredible rate. One example of something new is Kim Dotcom’s newly launched Mega cloud storage service with its free 50GB of storage. What really got my attention with this announcement was that the data would be stored encrypted; it is nice to see security being built into the offering from the beginning. There are a few bugs that are being reported, but hopefully it is the start of the push to secure the cloud.
With all the applications and services that are available, does the average small business need the expense of physical infrastructure within its organization? I just had a meeting with a client, and we talked about consolidating their physical infrastructure as much as possible and then migrating what was left to the cloud. During our conversation, we broke down the different applications needed to run the business so that we could look at each one separately. Continue reading Cloud Products and Services