Modern Infrastructure

The 2016 MI Impact Awards

When working with container technology, proceed with caution

Containers will require IT to rehash age-old questions about security and trust.

Have you heard the news? Container technology is here and will be the savior of IT! If you aren't familiar with containers, they encapsulate applications in a way that hides operating systems and other pesky infrastructure layers. Application developers can check out a container image, install their application, and then deploy it repeatedly. With these techniques, developers can build, ship, and run any app, anywhere, according to Docker, arguably the largest container software vendor.
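As a concrete illustration of that build-ship-run workflow, a minimal Dockerfile might look like the following sketch. The base image, file names, and port here are hypothetical, not taken from any particular project:

```dockerfile
# Start from a published base image (a hypothetical Node.js base here).
# Note: this is exactly the kind of third-party image whose provenance
# and patch level the rest of this column questions.
FROM node:6

# Copy the application into the image and install its dependencies.
WORKDIR /app
COPY . /app
RUN npm install

# Declare the port the app listens on and the command that starts it.
EXPOSE 8080
CMD ["node", "server.js"]
```

A developer would then run `docker build -t myapp .` once, and `docker run myapp` on any host with a compatible Docker engine. That is the "run any app, anywhere" promise in practice.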

I'm being sarcastic about containers being our savior, though. I see a lot of parallels between Docker and Java. Java promised the ability to write an application once and run it anywhere. That was a bold promise, and Sun Microsystems (and now Oracle) almost completely failed to deliver on it. We are forever mired in Java version problems, forward and backward version incompatibility, platform incompatibilities, performance issues between platforms, security problems, and so on.

We see some of these same problems with containers. Despite the promise of abstraction from the underlying operating system, there still is an underlying operating system that needs care and feeding. In particular, it needs updates and patching, which developers rarely do. Studies show that a majority of container images contain serious unremediated security issues. Furthermore, there are big trust issues in the container technology world. Is it okay that your developers are building applications on container images built by unknown people on the Internet? How do you know that those images are safe, and don't contain back doors or malware?

There are versioning problems, too, just like Java. There is a variety of container software out there, such as Docker, Rocket, LXC, VMware ThinApp and Solaris Zones, and it isn't uncommon for two different development teams to have chosen two different technologies. Each container technology has compatibility issues with the underlying infrastructure, too. Developers need version X of their container technology, but the operating systems my organization supports and secures aren't compatible with it, or require heavy retrofitting, which increases staff time commitments.

On top of this, there are very few management interfaces for containers. Chargeback/showback is unheard of. Security tools are nonexistent. Backup and restore isn't possible in the normal frameworks, either, which is a big problem not only for daily operations but also for disaster recovery and business continuity. Change management is laughed at. Given all the holes in the process, the pessimist in me starts thinking that containers are an elaborate way for developers to shirk the responsibilities of traditional IT, especially around risk management. And while it's clear that developers are eating the free lunch that containers promise, I often wonder who is paying for the meal, because it's a very expensive one.

So what do we do about it? For starters, we start asking all the same hard questions we've always asked. How are these things secured, and how do we prove it? How do we handle an incident with a container? Where is application data stored and how is it protected? Can we standardize all teams around one container platform? Who is building and maintaining "gold master" container images, and if it isn't our organization, how do we know we can trust them? How do our applications get security updates? How do containers mesh with our change management process, and how do we do capacity planning?

Because when all is said and done, there really is no such thing as a free lunch.
