There has been a lot of discussion recently about whether forking Docker makes sense. Driving this discussion are complaints from the Docker community and ecosystem about the speed at which Docker is releasing software and the perceived quality of those releases. Unless you have been hiding under a rock lately, you know that Docker is one of the most popular open-source projects in the world. Docker’s rise from a concept to a dominant force in the industry is a story for the ages. As Docker and containers continue to gain adoption in both non-production and production environments, vendors have been flocking to provide services that support or enhance Docker containers.
Docker’s Approach to Shipping Product
Docker’s approach to building and releasing software follows many of the principles of the lean startup methodology. The lean approach is to create a hypothesis, build the minimal amount of code required to support that hypothesis (often known as a minimum viable product, or MVP), release the code to the marketplace, gather feedback, and quickly iterate to improve the product.
Traditional Approach to Shipping Product
The traditional software vendor approach is to work hard to build the most stable, hardened solution possible, because customers expect products and services to always work. This tension between rapid iteration and hardened stability is what started the conversations about forking the Docker project. Vendors who depend on Docker code to provide stable solutions for their customers want Docker to focus more on stability and quality and less on innovation. Meanwhile, Docker is dominating the container space because its rate of innovation far exceeds anything many in the industry have ever seen before.
One solution being floated in the forums has somehow bubbled up to become a major story on social media and across the tech press: fork Docker so that other groups can have more control over the feature set and the stability of releases. The irony of this proposal is that most vendors building on Docker Engine are running on top of Docker 1.10 or 1.11 while complaining about the stability of 1.12. Instead of forking, which creates its own complexities, why don’t these companies simply support the release of Docker Engine that they feel is stable enough to run their business on?
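In practice, standardizing on a known-good release is straightforward with ordinary package-manager pinning. A minimal sketch follows; the package name and version string are illustrative assumptions and will vary by distribution and Docker release:

```shell
# Pin Docker Engine to a release you have already validated, rather than
# tracking the latest version as it ships.
# (Package name and version string below are examples; adjust for your distro.)

# Install the specific version you have qualified for production:
sudo apt-get install docker-engine=1.11.2-0~xenial

# Prevent apt from upgrading it during routine system updates:
sudo apt-mark hold docker-engine
```

Newer releases can then be evaluated in a staging environment, and the hold lifted only once a version meets your own stability bar.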
How Do We Move Forward?
Docker’s rapid pace of innovation has caused some members of the ecosystem to declare that Docker should slow down and innovate less. I understand that view from the standpoint of a vendor trying to build on top of Docker. But slowing down is a legacy mindset. Innovation is occurring in all phases of all businesses across all industries at rates never seen before in our lifetime. Companies should embrace this new pace of innovation and adapt their approach to harness it, not stall it. Slowing down, or doing things the way we have always done them, is a recipe for becoming extinct.
Here is an analogy I like to use. I have been in this industry since the early 1980s, and IT has always struggled to deal with the rate at which the business has asked for changes. I can’t count how many meetings I have been in over the years in which IT people brainstormed on ways to slow down their business partners so that IT’s job of building stable solutions could be easier. The problem is that slowing down the business may be helpful for IT, but it does not help the business. In fact, this has led to many business-driven workarounds that bypass IT to achieve the velocity the business needs to meet its goals. Outsourcing, purchasing software without consulting IT, building development shops within business units, shadow IT, and the like are all consequences of slowing down the business.
In these meetings, I always recommend that we should look at the business’s rate of change as a fixed constraint. Instead of trying to change the constraint, we should reevaluate how we build software to deal with the new reality that the business wants stuff faster than we can service it with our current processes. This has led to embracing concepts of agile, lean, and DevOps over the years.
I think the container industry should apply this same approach. The new constraint is that Docker is innovating at a pace that we have never seen before. The releases may not meet the stability requirements that we have been used to for the last twenty years. Instead of telling Docker to innovate more slowly, why not reevaluate our processes for dealing with this new constraint?
The whole Docker fork discussion is a one-off solution that addresses the symptoms of a problem and not the root cause. We all know that addressing symptoms and not fixing the root cause is a major driver of technical debt. It also introduces waste into the value stream. So why not address the root cause? The overarching issue is that Docker is moving at a pace that many people within the ecosystem can’t keep up with. The second issue is that the ecosystem wants a higher degree of stability. My recommendation is to treat those two issues as the new norm and reevaluate how to run your businesses with these known constraints. One solution might be to hold off support of the latest version until it is deemed reliable enough for your business model. I am sure that people who live within the ecosystem can come up with other solutions as well.
My point is that slowing down innovation is a losing proposition. I see this within enterprises in my daily work. Enterprises are so risk averse and so resistant to change and innovation that the competitors who embrace rapid innovation are eating their lunch. The same is going to happen within the container ecosystem. Those playing by the old rules will eventually lose to those who work within the new constraints.