This is a contributed piece by Micha Hernandez van Leuffen, founder and CEO of Wercker
I’m talking about software containers, a technology that has grown increasingly popular since Docker burst onto the scene in 2013.
A huge ecosystem of companies has sprung up around software containers, from Docker, which helped standardise container formats, to container scheduling and orchestration tools such as Kubernetes from Google, Swarm from Docker, and Fleet from CoreOS. As with any new technology trend, there are ongoing discussions around whether containers really are worth the hype that surrounds them, so it's worth taking a step back to look at how the technology has become so popular.
So what are containers?
Simply put, containers are a method of software abstraction that gives developers more horsepower and flexibility. Back in the day, developers worked on 'bare metal' - that is, designing software with specific hardware capacities in mind. This clearly limited what they could do, so developers turned to virtual machines.
Virtual machines (VMs) allow developers to abstract their work away from the hardware by using software, called a hypervisor, that emulates hardware capabilities such as CPU, storage, memory and networking. This lets many software tasks run simultaneously across multiple VMs on a single physical machine. Think of containers as lightweight versions of virtual machines: instead of a hypervisor, the virtualisation is done by the underlying operating system (OS-level virtualisation).
Containers don’t need to make a virtual copy of the host server’s hardware features, and they also don’t need a full copy of the host operating system to be installed within the container. This enables containers to be orders of magnitude more lightweight and flexible, which in turn means that developers can fit far more containers on a single server than they would virtual machines. OS-level virtualisation also allows for rapid creation of containers. The end result? Developers have more firepower to play with.
Containers should run everywhere, from a developer's laptop to a production cluster running on Amazon Web Services, for example. The technology therefore helps solve the problem of portability, allowing developers to work on ever more complex applications.
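As a rough illustration of that portability, a containerised service can be described in a short Dockerfile. (The application file and image name here are hypothetical; this is a minimal sketch, not a production configuration.)

```dockerfile
# Start from a minimal official Python base image
FROM python:3-slim

# Copy the (hypothetical) application into the image
WORKDIR /app
COPY app.py .

# The container starts the same way on every host
CMD ["python", "app.py"]
```

Building this with `docker build -t myapp .` and running it with `docker run myapp` produces the same environment on any machine with a container runtime, whether that's a laptop or a cloud cluster.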
Containerising software should also increase its security and reliability. When software is built using the old 'monolithic' model, if an error is found developers have to spend a great deal of time analysing the application, identifying the source of the issue, and fixing it without breaking any dependencies. If that same software is containerised, the problem can quickly be isolated to an individual container, which can be removed, fixed and reinserted into the application with minimal loss of service.
This makes containers a perfect vehicle for microservices, which allow developers to split their applications into smaller components. All of this adds to a compelling proposition and a clear solution for the modern cloud and cloud-native applications.
Who is using containers right now, and what does the future look like for this technology? Containers have been the object of significant investigation and investment over the past two years. For example, in 2015 Goldman Sachs, which employs over ten thousand software engineers, invested $95m in Docker and plans to move 95% of its workloads onto the platform over a two-year period.
Other high profile container adopters include Oracle, which last year acquired StackEngine to bolster its cloud services offering. More recently, tech giant Cisco announced that it plans to acquire ContainerX. Other Docker supporters include Amazon Web Services, Google Cloud and Microsoft Azure.
What’s more, the long list of adopters continues to grow. A recent study showed that an astonishing 81% of organisations are planning to increase their investment in container technology.
More and more organisations, from tiny startups to large enterprises, are looking for ways to move faster, release software sooner, and ensure that their (often very expensive) development staff are as productive as possible. While container technology has certainly received a great deal of hype in recent months, it's also clear that it goes a long way towards resolving these issues.
Despite the perceived technical complexity, containers are actually solving a very human problem: how can we make increasingly complex software architectures more modular, and thus more manageable? How can we make developers’ lives easier and more productive as they’re asked to create exponentially more complex software services?
That was precisely the issue that I set out to solve with my company Wercker. We have developed a full automation platform that helps developers build, develop and deploy multi-service cloud-native applications. We have seen the development of containers first hand, and we're already seeing some huge global companies using containers in production.
Because of these fundamentals, the future for containers is bright, and we’re incredibly excited about what’s next, particularly as we edge closer to dev/prod parity and immutable software releases. ‘Automating all the things’ has been a meme-worthy cliché for some time, but with containers we can finally get there.