Our data protection analysis focuses on the ever-growing list of requirements of the modern data center and hybrid cloud. We look for those requirements that address the future, not the past, and classify them into major categories, adding categories as needed. The categories are then graphed to form the basis of our coverage reports.
Articles Tagged with Zerto
As we all know, data protection is not really about how we back up or replicate data; it is about how we recover it. Recovery is not just about disasters: it also covers restoring individual files and continually testing that recovery actually works. Data protection must not be “set and forget.” Our ever-changing hybrid cloud environments require proactive data protection. We need to detect changes to applications. We need software that adjusts backup or replication to pull in more and more of the application. In essence, data protection should not require a human to be involved. Where are we in relation to this goal?
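The idea of protection that adjusts itself can be sketched in a few lines. This is an illustration only, not any vendor's API: the function and the component names are assumptions. It compares what discovery finds against what the backup policy already covers, and enrolls anything new without a human in the loop.

```python
"""Illustrative sketch (not a real product API): proactive data
protection that detects new application components and adds them
to the backup policy without human intervention."""


def reconcile_protection(inventory: set[str], protected: set[str]) -> set[str]:
    """Return the components that discovery found but the policy misses."""
    return inventory - protected


# Hypothetical state: what discovery found vs. what the policy covers.
inventory = {"web-vm", "app-vm", "db-vm", "new-cache-vm"}
protected = {"web-vm", "app-vm", "db-vm"}

for component in sorted(reconcile_protection(inventory, protected)):
    # A real tool would call the backup product's API here.
    protected.add(component)
    print(f"added {component} to backup policy")
```

Run on a schedule, a loop like this is what removes the human from the "set and forget" failure mode: the protected set tracks the application as it grows.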
Innovation is a critical part of any business, particularly a software business. However, as we know from Clayton M. Christensen’s book The Innovator’s Dilemma, it is hard to innovate in a large company. The challenge is that many innovations will disrupt the existing revenue stream. But without innovation, the revenue stream will inevitably end. To remain a viable business, innovation needs to be fostered and adopted, even at the risk of short-term self-disruption.

One way a growing company can remain innovative is by encouraging engineering teams to innovate through hackathons. A hackathon is a short period, usually twenty-four hours, during which a group of developers collaborates to write some software very fast. The aim is a high-energy drive to prove out ideas or build a rapid prototype. The events usually run on a diet of caffeine and pizza.

The hackathon participants each bring their own ideas, and the group together decides which ideas to pursue. The developers form their own small, temporary teams to work on their chosen ideas. At the end of the hackathon, each team reports to the whole group on its idea and the progress it was able to make. This type of brief but intense activity is invigorating for the creative side of software development. Participants typically work all night with few breaks in order to build as much of the idea as possible. This rapid development of a new idea is usually a welcome break from the normal software development processes of bug fixing and QA testing.
Recently at Dell World, I was part of a conversation about what utopian disaster recovery would look like and where the industry stands today. Where we are today is transforming, under a new name that encompasses many technologies. We now use the term “data protection” (DP) to cover not just disaster recovery (DR), but also backup, business continuity (BC), replication, data loss prevention, and even basic confidentiality functions such as encryption. The main goal of data protection is to let you use your data as quickly as possible, wherever it is needed, with minimal or no loss.
When we mention data protection for the hybrid cloud, we are usually talking about backing up to the cloud. The cloud becomes a repository for our backup images, and in some cases those backup images can be launched within clouds that use the same technology. Being able to send data to the cloud is becoming table stakes for infrastructure-as-a-service (IaaS) data protection. However, once we move outside the realm of IaaS to platform or software as a service (PaaS or SaaS), data protection is hit or miss.
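The "cloud as backup repository" pattern reduces to packing a backup image, shipping it to an object store, and recording a checksum so the catalog can verify the copy later. A minimal sketch, with an in-memory dict standing in for the object store (a real tool would use an S3-compatible SDK such as boto3):

```python
"""Minimal sketch of the cloud-as-backup-repository pattern.
The object store here is a stand-in dict; a real implementation
would upload to S3-compatible storage instead."""

import hashlib
import io
import tarfile


def make_backup_image(files: dict[str, bytes]) -> bytes:
    """Pack files into a gzipped tar archive, our stand-in backup image."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()


def upload(store: dict, key: str, image: bytes) -> str:
    """Stand-in for an object-store upload; returns the SHA-256 checksum
    so the backup catalog can verify integrity on restore."""
    digest = hashlib.sha256(image).hexdigest()
    store[key] = image
    return digest


cloud_repo: dict[str, bytes] = {}  # pretend object store in the cloud
image = make_backup_image({"etc/app.conf": b"setting=1\n"})
checksum = upload(cloud_repo, "backups/app/2024-01-01.tar.gz", image)
```

The checksum step matters: a backup image you cannot verify is a backup you cannot trust to launch in another cloud.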
One aspect of the SDDC that does not get a lot of attention is data protection; instead, we concentrate on SDN and automation. Yet this leads me to data protection: there is a clear marriage between data protection and the SDDC that needs to be added to any architecture. As with all things, we start with the architecture. Our SDDC architecture should also include data protection, but what data are we really protecting? Within the SDDC there are three forms of data: tenant, configuration, and automation. Without all three, we may not be able to reload our SDDC during a disaster. What are these three types of data, what is required to capture each of them, and how can we add data protection into the SDDC cleanly?
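The three-forms rule lends itself to a simple completeness check: a recovery plan is viable only if every form has at least one protected item. A sketch under assumed names (the item names and the `ProtectedItem` type are illustrative, not an SDDC API):

```python
"""Illustrative sketch: an SDDC recovery plan is complete only when
tenant, configuration, and automation data are all protected.
Item names and types are assumptions for illustration."""

from dataclasses import dataclass

SDDC_DATA_FORMS = {"tenant", "configuration", "automation"}


@dataclass
class ProtectedItem:
    name: str
    form: str  # one of SDDC_DATA_FORMS


def missing_forms(items: list[ProtectedItem]) -> set[str]:
    """Return the SDDC data forms with no protected item at all."""
    covered = {item.form for item in items}
    return SDDC_DATA_FORMS - covered


plan = [
    ProtectedItem("tenant-vm-backups", "tenant"),
    ProtectedItem("management-config-export", "configuration"),
]
# Automation data (scripts, blueprints, runbooks) is unprotected here,
# so this SDDC could not be fully reloaded after a disaster.
gaps = missing_forms(plan)
```

A check like this belongs in the architecture review, not the post-mortem: the gap in automation data only hurts on the day you try to reload the SDDC.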