SAN FRANCISCO -- DevOps deployment pipelines would be a challenge to build even if technology could be frozen in time, and in the IT industry it never is.
From VMs to containers to immutable infrastructure and complex clustered orchestration tools, technology changes ever faster as enterprises build DevOps deployment pipelines. New technologies such as containers and microservices architectures will pave the way for flexibility, according to large enterprises at this week's DevOps Enterprise Summit.
"We will never be able to say that we are done," said Topo Pal, a director at Capital One. "A pipeline is not one single tool; it's a set of many tools put together, sort of like microservices."
Linking together multiple open source software components through APIs that remain largely separate from one another is the key to maintaining flexibility, Pal said.
"We can actually switch out Jenkins and put something else in place of Jenkins," he said. "We can switch out our testing tools. We can go all the way to whether we want to bake our Amazon Machine Image or do Docker, since we don't know what's coming down the line in the next six months."
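Pal's description of a pipeline as a set of loosely coupled tools can be sketched in code. The following is a hypothetical illustration, not Capital One's actual implementation: each pipeline stage sits behind the same small interface, so any one stage (the CI server, the test runner, the image-baking step) can be swapped out without touching the others. All names and functions here are invented for the example.

```python
# Hypothetical sketch: a deployment pipeline modeled as a list of
# interchangeable stages. Each stage is a function that takes a build
# context and returns an updated one, so stages can be swapped freely.
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]

def jenkins_build(ctx: Dict) -> Dict:
    # Stand-in for a call to a CI server's API; could be replaced by
    # any other build tool that honors the same Stage signature.
    return {**ctx, "artifact": f"{ctx['repo']}-build.tar.gz"}

def docker_package(ctx: Dict) -> Dict:
    # Packaging stage; an AMI-baking stage could be dropped in here
    # instead, with no change to the rest of the pipeline.
    return {**ctx, "image": f"registry/{ctx['repo']}:latest"}

def run_pipeline(stages: List[Stage], ctx: Dict) -> Dict:
    # Run each stage in order, threading the context through.
    for stage in stages:
        ctx = stage(ctx)
    return ctx

result = run_pipeline([jenkins_build, docker_package], {"repo": "billing"})
print(result["image"])  # registry/billing:latest
```

The point of the sketch is the seam, not the stages: because every tool is wrapped in the same function signature, replacing Jenkins or switching from Docker images to baked AMIs means swapping one list element.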
At Hyatt Hotels, the IT team has built out a container platform based on the recently introduced Docker Swarm Mode to take on new programming languages as they come down the pike.
"The container platform is built entirely around the model of operating the platform and operating the containers without having to actually concern ourselves with what's running inside them," said Ray Krueger, VP of engineering at Hyatt.
"We've abstracted away the operating environment, we've abstracted away the concerns around how we run software, and we now truly have the flexibility to start doing all kinds of things while having the exact same level of support for the platform," Krueger said.
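The abstraction Krueger describes can be illustrated with a toy reconciliation loop. This is a hypothetical sketch of the general orchestration pattern, not Hyatt's platform: the operator layer knows only image names and desired replica counts, never what language or runtime is inside a container. All names here are invented.

```python
# Hypothetical sketch: a platform layer that operates containers
# without knowing what runs inside them. It only compares desired
# state (services and replica counts) to running state.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Service:
    name: str
    image: str      # could hold Java, Node.js, Go -- the platform doesn't care
    replicas: int

def reconcile(desired: List[Service], running: Dict[str, int]) -> List[str]:
    """Return the scheduling actions needed to converge the running
    state toward the desired state."""
    actions = []
    for svc in desired:
        have = running.get(svc.name, 0)
        for _ in range(svc.replicas - have):
            actions.append(f"start {svc.image} as {svc.name}")
    return actions

# One replica is running; two more starts are needed to reach three.
actions = reconcile([Service("booking", "hyatt/booking:1.2", 3)],
                    {"booking": 1})
```

Because the reconciliation logic never inspects container contents, the same platform support applies whether a new service is written in a familiar language or one the operations team has never run before.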
Of course, containers come with their own learning curve and mentality changes in a legacy environment. Speakers here frequently discussed the need to pay down technical debt to continue to innovate.
It's about "leaving space to make sure that it's not about just getting new features out, but having the flexibility to reprioritize and allow us to get the best value out [of new tech]," said Opal Perry, divisional CIO for claims at Allstate Insurance Co.
Continuous monitoring and tuning are also critical to maintain a DevOps deployment pipeline's effectiveness while slotting in new technologies, she said.
"Walk before you run" has been the approach at Hyatt with adopting containers.
"Our operations team is used to running VMs and putting load balancers in front of them, so rather than disrupt that, we wanted to get really good at running containers first," Krueger said. "We're using containers to package and distribute and deploy software to VMs. … We just build cookie-cutter VMs that have the Docker Engine, Nagios and Splunk on them."
Docker Swarm currently runs in a preproduction environment and will be developed further in 2017, Krueger said.
It's also important to break the infrastructure down to its irreducible components, such as data structures, to keep it as consistent as possible as new technologies come into play, said Robin Yeman, an Agile leader at Lockheed Martin Corp., based in Bethesda, Md.
"If I really focus on the data I need underneath and the metrics associated [with it], then, quite honestly, I can sit different tools on top of it all the time," Yeman said in a presentation. "I don't have to dictate your user interface, and I can still keep the teams moving and being successful."
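Yeman's approach of stabilizing the data layer while letting tools change can be sketched as follows. This is a hypothetical illustration, not Lockheed Martin's system: the metrics records keep a fixed shape, and any dashboard or reporting tool computes its own views on top. The field names are invented for the example.

```python
# Hypothetical sketch: a fixed metrics schema underneath, with
# interchangeable tools computing views on top of it.
from typing import Dict, List

# The stable data layer: every deployment emits a record of this shape,
# regardless of which tools produced or will consume it.
deploy_metrics: List[Dict] = [
    {"service": "claims", "lead_time_hours": 4.5, "failed": False},
    {"service": "claims", "lead_time_hours": 6.0, "failed": True},
]

def failure_rate(metrics: List[Dict]) -> float:
    # Any UI or reporting tool can derive views like this one from the
    # same records; the presentation layer is free to change.
    failures = sum(1 for m in metrics if m["failed"])
    return failures / len(metrics)

print(failure_rate(deploy_metrics))  # 0.5
```

Because the schema, not the tool, is the contract, teams can adopt a new dashboard or test harness without disturbing the metrics that leadership and other teams depend on.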