As the container craze continues, admins are starting to look deeper.
Container management tools have become a hot topic, but how necessary are they? It depends on who you ask.
Built.io is a service provider that offers a back end for organizations developing mobile applications. It also allows customers to upload custom code to its service in the form of a Docker container.
Built.io started working with Docker back in the prehistoric days of 2013, when there were few, if any, management tools available for the platform. As such, it created its own API-based management layer that performs tasks such as starting and stopping containers and restarting them in the event of a failure.
But the ephemeral nature of the containers that Built.io runs also limits its need for elaborate workload orchestration and placement tools such as Google's Kubernetes or Mesosphere, said Nishant Patel, Built.io CTO. Instead, the team simply monitors the queue of containers that it needs to process, and if the queue fills up, it launches another cluster on AWS. "Scaling up and down is pretty easy," said Patel.
But at GE Appliances in Louisville, KY, a good container management system is paramount to container success. Last year, the firm developed a self-service test and dev private cloud based on Docker and Mesosphere to shorten the time between a developer submitting code and getting it up and running. An initial infrastructure-as-a-service implementation had whittled that process down from six weeks to a still-underwhelming three.
The old system, "had an atrocious rate of adoption," said Brett Luckabaugh, GE Appliances enterprise software architect. "The barrier to entry was just too high for a lot of developers," for instance, requiring them to learn Puppet to automate infrastructure builds. "It's not like in the hyperscale market where they have thousands of nodes and a few apps. For us it's just the opposite -- we have thousands of apps on a few nodes."
The combination of Docker plus Mesosphere has largely been a winning one for GE. Docker provides high portability, and developers love it. "If you can use a shell, you can grasp a Dockerfile," Luckabaugh said. Mesosphere, meanwhile, provides fast deployments and scheduling of tasks, scaling, management of containers, self-healing/fault tolerance, and overall simplification of data center management. A year into the project, GE was running 350 applications across 800 Docker containers, with many more in the pipeline.
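Luckabaugh's point is easy to see on the page: a Dockerfile reads much like a short shell script. The base image and application below are hypothetical, not taken from GE's environment; this is just a minimal sketch of the format.

```dockerfile
# Start from a small base image
FROM alpine:3.19

# Install a runtime dependency, much as you would at a shell prompt
RUN apk add --no-cache python3

# Copy a (hypothetical) application into the image
COPY app.py /srv/app.py

# The single command the container runs when it starts
CMD ["python3", "/srv/app.py"]
```

Building and running it takes two more shell-like commands: `docker build -t myapp .` followed by `docker run myapp`.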
But this is by no means a done deal. For instance, GE Appliances uses a scheduler called Marathon in conjunction with Mesos resource management, and integration with Docker is still nascent. That led the team to go forward with building its own system, an internal Web app it calls Voyager that provides automated Docker builds, service discovery and load balancing, plus a user interface and API access. Going forward, the team will also keep its eye on orchestration and management tools from Docker proper, to see what value they bring to the table.
Mind the gap
Enterprise IT shops are interested in containers, but there's a lot that needs to happen before they adopt them whole-hog, said Andi Mann, business technology strategist at Sageable, an independent technology consulting firm.
"I'm hearing a lot of enterprises talk about whether they should or shouldn't adopt containers, which usually means that they will," said Mann. We're already seeing a lot of traction in agile development environments, where there are strong processes in place for open source tools, but enterprise-wide adoption is a different story, and "the difference is the management layer."
"Sure, there are a lot of orchestration tools, for example, but we're still missing a lot of the more mundane management features such as test automation, provisioning, security and performance monitoring," Mann said.
Application vs. operating system containers
Operating system containers have a history that goes back decades, but application containers à la Docker are a relatively new phenomenon. Operating system containers such as Solaris Zones, BSD Jails and Linux LXC share a single kernel, but each container can run multiple processes and services. "With [LXC], you can run an entire OS," said Dustin Kirkland, Ubuntu product manager at Canonical. "We boot the init system" -- Linux-speak for the first process to run once the kernel is loaded. Application containers, meanwhile, run a single process per container, and interaction with the underlying operating system and kernel is handled by an intermediary such as Docker Engine, which is responsible for building, distributing and running application containers.
Take container monitoring, for example. Getting information and performance metrics about the container itself has been made easier as of late with the introduction of the Docker Stats API, said Raj Sabhlok, president at ManageEngine, a management software vendor, but there's still a ways to go. "In order for operations teams to feel totally confident with the manageability of containers, container management data will have to be correlated with the underlying Linux operating system and the application itself," he said.
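The correlation work Sabhlok describes starts from raw counters like those the Docker Stats API reports. As one illustration, container CPU percentage is conventionally derived from the deltas between two successive samples; the function below sketches that calculation in Python, with made-up sample values shaped loosely like the `precpu_stats`/`cpu_stats` sections of a stats response.

```python
def cpu_percent(precpu: dict, cpu: dict) -> float:
    """Derive container CPU usage (%) from two successive stats samples.

    Uses the common delta-based calculation: the container's share of
    total system CPU time over the interval, scaled by the CPU count.
    """
    cpu_delta = cpu["cpu_usage"]["total_usage"] - precpu["cpu_usage"]["total_usage"]
    system_delta = cpu["system_cpu_usage"] - precpu["system_cpu_usage"]
    if system_delta <= 0:
        return 0.0
    return (cpu_delta / system_delta) * cpu["online_cpus"] * 100.0

# Hypothetical samples (counter values in nanoseconds)
prev = {"cpu_usage": {"total_usage": 400_000_000},
        "system_cpu_usage": 10_000_000_000}
curr = {"cpu_usage": {"total_usage": 600_000_000},
        "system_cpu_usage": 11_000_000_000,
        "online_cpus": 2}

print(round(cpu_percent(prev, curr), 1))  # → 40.0
```

Correlating a figure like this with host-level metrics and application response times is exactly the gap Sabhlok says operations teams still need filled.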
And while the open source community has put a lot of effort into developing container technology, enterprises are still wary of new open source projects with dubious longevity, said Mann. "There are so many one-off management tools that get written and then are left by the wayside."
Indeed, the whole field of containers is so nascent, it's hard to predict how it will play out in the enterprise, said Canonical's Kirkland.
"At a large scale, it absolutely makes sense," he said -- the Netflixes and Googles of the world "absolutely need that stack." But for others, "it's a slippery slope."
"You have a 25-year-old developer in the Valley sitting at a Starbucks running Docker on his laptop. [With Docker], he writes a piece of code, and the next thing you know, he's moved it into production at Amazon. He tells his supervisor, and the next thing you know, they're putting everything in Docker," Kirkland said.
But that strategy, while easy, isn't necessarily the way to go, he said. "It may take a CTO to take a step back and ask -- 'Are the problems I really have solved by this solution?'" The answer may be yes, but then again, it may not be.
About the author:
Alex Barrett is editor in chief of Modern Infrastructure. Email her at firstname.lastname@example.org.