Software-defined storage (SDS) is about data services. Many think it is about automating storage, and I can see why, but its real value lies in what storage can deliver. So, what is the basis for SDS? There are four critical components: analytics, augmentation, aggregation, and security. These four elements wrap storage to turn it into data services, and data services, along with control over them, are therefore the key components of SDS. What data services can SDS provide that do not already exist? Is it enough just to add deduplication, or is more necessary? Let us look at these data services in detail.
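The idea of data services wrapping storage can be sketched in a few lines of code. This is a minimal illustration, not any real SDS product's API: `RawStorage`, `DataServiceLayer`, and their methods are all hypothetical names, and only two of the four elements (analytics as an access log, augmentation as deduplication) are shown; aggregation and security would layer on in the same way.

```python
# Hypothetical sketch: raw storage wrapped by SDS data services.
# All class and method names are illustrative, not a real product API.
import hashlib

class RawStorage:
    """A plain store with no data services of its own."""
    def __init__(self):
        self.blocks = {}

    def write(self, key, data):
        self.blocks[key] = data

    def read(self, key):
        return self.blocks[key]

class DataServiceLayer:
    """Wraps raw storage with two of the four SDS elements:
    analytics (an access log) and augmentation (deduplication)."""
    def __init__(self, backend):
        self.backend = backend
        self.access_log = []   # analytics: who touched what, and when
        self.key_map = {}      # augmentation: key -> content digest

    def write(self, key, data):
        self.access_log.append(("write", key))
        digest = hashlib.sha256(data).hexdigest()
        self.key_map[key] = digest
        # Identical content is stored only once (deduplication).
        self.backend.write(digest, data)

    def read(self, key):
        self.access_log.append(("read", key))
        return self.backend.read(self.key_map[key])

store = DataServiceLayer(RawStorage())
store.write("report-a", b"same bytes")
store.write("report-b", b"same bytes")   # deduplicated against report-a
print(len(store.backend.blocks))          # 1 block despite two writes
```

The point of the sketch is that the value sits in the wrapping layer, not in the raw blocks underneath: the same backend gains deduplication and usage analytics without changing at all.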
Secure Agile Cloud Development takes Agile and DevOps to the next level. It is about code quality, based not just on what the developers test, but also on the application of continuous testing and on dynamic and static code analysis. Most importantly, it is about a repeatable and trackable process by which we can make code quality assessments. We can find out the “who did what, when, where, how, and why” of our code. It is a useful tool in incident response. Imagine a world in which our production environments are run entirely by code.
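A repeatable, trackable quality assessment of this kind can be sketched as a simple gate that merges test results and analysis findings into one audit record. Everything here is hypothetical: `Finding`, `assess()`, and the record fields are illustrative stand-ins, not the API of any real CI or analysis tool.

```python
# Hypothetical sketch of a trackable code-quality gate.
# Finding, assess(), and the record fields are illustrative only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Finding:
    severity: str   # e.g. "high", "medium", "low"
    location: str   # file or module flagged by static/dynamic analysis
    author: str     # from version-control blame: the "who"

def assess(tests_passed, findings, max_high=0):
    """Combine test results and analysis findings into one audit record."""
    high = [f for f in findings if f.severity == "high"]
    return {
        "when": datetime.now(timezone.utc).isoformat(),   # the "when"
        "tests_passed": tests_passed,
        "high_findings": [(f.location, f.author) for f in high],  # what/who
        "passed": tests_passed and len(high) <= max_high,
    }

record = assess(True, [Finding("high", "auth.py", "alice")])
print(record["passed"])   # False: a high-severity finding blocks the gate
```

Because every assessment produces a timestamped record with locations and authors attached, the "who did what, when, and where" question becomes a lookup rather than an investigation, which is exactly what makes it useful in incident response.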
Let’s start the new year right with one of my current favorite topics for discussion: automation. In this article, I concentrate on automation for second-day operations. Second-day operations is quite a different beast from build and decommission automation, in that it incorporates several different approaches.
After the Apollo 1 disaster, astronaut Frank Borman told Congress that the tragedy had not been caused by any one company or organization, but by everyone involved with the Mercury, Gemini, and Apollo programs. The problem had been a failure of imagination. They knew that at some point there would be a fire in a space capsule; however, they assumed it would happen somewhere in space. They simply did not consider the possibility of a fire while the capsule was still on Earth. Within the security world, we call this failure of imagination “unknown unknowns,” but it boils down to the same thing: we just do not think about some things. Even with all the tools out there to help us, we have failures of imagination.
We all need performance and capacity management tools to fine-tune our virtual and cloud environments, but we need them to do more than just tell us there may be problems. Instead, we need them to find root causes, whether those problems are related to code, infrastructure, or security. The new breed of applications designed for the cloud à la Netflix, as well as older technologies instantiated within the cloud, need more tools to tell us about their health. Into this breach step a new set of tools, alongside an existing set.
Have you noticed lately that the term “big data” is being used with increasing frequency? It seems that working with big data is one of the more desired and in-demand skill sets in the technology space. What do you think “big data” is, and what do you think it represents? One definition to consider is this one from Wikipedia: “Big data usually includes data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage, and process the data within a tolerable elapsed time.” So, who benefits the most from its use? Have you stopped to consider just what makes up big data? Let’s explore that question a little more deeply.