In the last 12 months, Docker and the containerization paradigm have challenged the foundations of some long-held software delivery principles. Teams are adopting practices that were unheard of 18 months ago and using them to deliver software faster and cheaper. One such debate is build-driven delivery vs. image-driven delivery.
For the last few decades, software has been packaged by a build system and tagged with a number; testers aligned their testing to that tag as the entity under test and promoted it until it reached production. The build number was one of the most important tags for anyone communicating about the status of the software in the delivery pipeline.
For some time now—primarily due to Docker—several teams have moved to a methodology in which the image defines the version of the software. Every change to the application code or the deployment code produces a new image, which is tagged and used as the version of the software promoted downstream and, finally, into production.
The debate then is, which option is better?
Let me take a shot at this; I invite your thoughts as well.
For build-driven delivery:
- Builds allow software to be tagged irrespective of the target hardware and other deployment options. A build bundles all of its dependencies to ensure the software works as expected when deployed. This gives the development team a clean hand-off point before moving on to the next set of work, or sprint.
- Builds are much smaller in size, as a build bundles only the software the application actually needs; keeping one build per change is therefore not a costly storage proposition.
- Versioning artifacts and bundling them at run time lets tools such as Maven build and link dynamically, fetching dependencies from remote repositories. This makes it easier to keep builds thin and further reduces the redundant data stored in each one.
- Build repositories have matured over the years to provide a very easy way to version and mark dependencies so deployment tools can find new versions and automatically upgrade environments as needed.
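The Maven-style dependency resolution described above can be sketched with a minimal POM fragment; the artifact coordinates here are purely illustrative, not from the original article:

```xml
<!-- Hypothetical dependency declaration: Maven resolves and downloads
     this artifact from a remote repository at build time, so the
     build itself stays thin and the dependency is versioned separately. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>payments-client</artifactId>
  <version>2.3.1</version>
</dependency>
```

Because the version is an explicit, machine-readable field, repository and deployment tools can detect newer versions and upgrade environments automatically, as the point above notes.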
For image-driven delivery:
- Images take the concept of the build to the next level and remove variability in target hardware and deployment options completely. An image bundles the application—either as produced by the build or compiled during image creation—and the image itself is pushed as the deliverable.
- Docker registries have begun to position themselves as build repositories, providing similar options for end users.
- The image-driven option provides the added benefit of standardizing all your environments, removing the question, "It works in my environment; does it in yours?" If it works in the container on a developer's laptop, it should work in production as well—everything is the same, except for the size and number of resources.
- Images are much larger than builds, since an image must also carry the system software and the application's base software. This overhead was once a deal-breaker, but with container images coming down to megabytes, it is not such a big issue today. With layering, sizes can be reduced much further as you peel the onion of dependencies. Take a look at this nice article by Brian on this topic.
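The layering point above can be sketched with a minimal Dockerfile; the base image and file names are illustrative assumptions, not from the original article:

```dockerfile
# Each instruction below produces a cached layer. Layers shared between
# images (such as the base image) are stored once in the registry, so
# only the application layers change from release to release.
FROM alpine:3.19
# Hypothetical application binary; this thin layer is all that differs
# between two consecutive image versions.
COPY ./myapp /usr/local/bin/myapp
CMD ["myapp"]
```

Rebuilding after an application change reuses the cached base layer and replaces only the `COPY` layer onward, which is why per-change image storage is far cheaper than the raw image size suggests.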
I guess the choice depends on where your application is in its life cycle. The questions teams should really ask are:
a) Do we need to change too much to pick an option for an incremental benefit?
b) Do I have the issue of environmental differences?
c) Can I merge these options to get a hybrid solution going?
But for me the larger question is: which option would you choose if you were starting from scratch tomorrow? I would love to hear your thoughts.
Finally, irrespective of which way you lean, image-driven delivery is one of the best ways to bring DevOps culture to a team: if the image works, the team has won; it doesn't matter whether it was ops code or dev code!