
Embrace DevOps innovation to modernize legacy applications

To bring legacy applications into the DevOps fold, IT teams must be creative and collaborate in ways that may differ sharply from approaches to greenfield projects.

DevOps isn't just for greenfield applications, and the way some enterprises modernize their older applications could actually put them ahead of the industry's IT transformation curve.

At established enterprises that date back more than a century, such as Boston-based John Hancock, the team that modernizes legacy applications is separate from the one that handles cloud-native apps, and each works under different constraints.

This is also true for newer companies, such as Carfax, an automotive e-commerce service provider based in Centreville, Va., which was founded in 1984. Carfax must still teach old tools new tricks to streamline data management for legacy apps.

"I call our legacy DevOps philosophy the Statue of Liberty approach," said Kurt Straube, systems director at John Hancock, who leads the IT teams responsible for the company's legacy application modernization. "On the Statue of Liberty, it says, 'Bring me your tired, your poor ...' And we say, 'Bring me your old, unsupported, third-party-based legacy apps, and we will modernize them in short order, for short money.'"

While the apps themselves are old and even unsupported by vendors, such as those based on Microsoft's .NET 3.5, their modernization requires innovative DevOps tools that support stateful and data-intensive applications. Here, Straube's teams experiment with smaller vendors whose tools may not have widespread use, balance investment in a modernization project against a legacy app's worth, and work within numerous regulatory compliance and security constraints.

Those restrictions mean Straube can't simply integrate with the same DevOps infrastructure automation platforms used for cloud-native apps, such as the Pivotal Container Service.

"You know what it would cost to convert our legacy apps over to Pivotal and how long it would take?" Straube said. "You'd probably be looking at a billion dollars, and it's not nearly worth that, so we don't build a conventional pipeline."

Back to the DevOps future with legacy apps

Legacy Microsoft SQL Server databases are a microcosm of the overall approach to modernize legacy applications at John Hancock, Straube said. The company uses WinDocks' database containerization tools to migrate legacy database systems -- some more than 10 years old -- to Microsoft Azure.

A slide shows the evolution of DevOps for legacy systems at John Hancock.

So far, that work shows promise. John Hancock's test data management team has used WinDocks to quickly spin up containerized SQL Server databases along with masked versions of their data -- the first step toward the creation of self-service automation for the legacy SQL Server test environment. WinDocks' price also beat out similar tools from competitors such as Redgate, as well as more elaborate container management platforms, such as Red Hat OpenShift.
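WinDocks layers its own cloning and masking features on a Docker-compatible interface, so the sketch below only illustrates the general pattern of spinning up a disposable SQL Server test instance, using the standard Docker SDK for Python and Microsoft's public SQL Server image rather than WinDocks itself; the image tag, password and port are placeholders, not John Hancock's actual configuration.

```python
# Minimal sketch: spin up a throwaway SQL Server container for a test
# environment, in the spirit of the containerized test databases described
# above. Uses the Docker SDK for Python and Microsoft's public SQL Server
# image -- not WinDocks, whose data-cloning and masking features are its own.
import docker

client = docker.from_env()

# Start a disposable SQL Server instance; the password and host port are
# placeholders for illustration only.
container = client.containers.run(
    "mcr.microsoft.com/mssql/server:2019-latest",
    detach=True,
    environment={
        "ACCEPT_EULA": "Y",
        "MSSQL_SA_PASSWORD": "ChangeMe_Str0ng!",
    },
    ports={"1433/tcp": 14333},  # map container port 1433 to host port 14333
    name="sqlserver-test",
)

print(f"Test database container started: {container.short_id}")

# Tear the instance down when the test run is finished.
# container.stop()
# container.remove()
```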

However, Straube's team must negotiate compliance and security constraints before WinDocks becomes part of the day-to-day DevOps process for legacy SQL Server applications. The team recently learned it needs a firewall rule for every developer laptop, which sent it back to the drawing board on WinDocks integration.

"It would be a whole different story [in an organization] with no rules or regulations, but we're anything but that," Straube said. "I earn a living by finding creative ways to get things done in a highly regulated, constrained environment ... that's what I do."

Despite the constraints on its approach to modernize legacy applications, Straube's team has incorporated 128 legacy applications into its continuous integration pipelines for test and development. Developers made 1,350 deployments of those applications to production and more than 30,000 deployments to nonproduction environments in the second quarter of 2018 alone. His organization also has broad latitude to choose the CI/CD tools best suited to each application, while XebiaLabs and Tasktop tools optimize their pipelines' performance and tie them together.


"Depending on how legacy your [app] is, you're actually going to be at the bleeding edge with some of this stuff," Straube said.

Some vendors offer DevOps products for IBM DB2 management, for example, but they may be new to the space, and their tools may not always work as anticipated. "You have to fail fast," he added.

DevOps is all about collaboration, but Straube doubts that DevOps pipelines for legacy and cloud-native apps will ever merge; the systems that service them are already too different, and he doubts legacy apps will ever be deployed to production automatically. But, eventually, today's greenfield apps will become legacy, and Straube's teams will be ready to support them.

In the meantime, Straube advised other enterprise IT pros who plan approaches to modernize legacy applications to get buy-in from upper management, and then "be prepared to roll up your sleeves and do stuff that's not so sexy, the stuff no one else is looking at," he said.

XebiaLabs' XL Release tool modernized disaster recovery for John Hancock legacy apps.

Legacy tool interface update modernizes Carfax data management

At companies with technical debt, the approach to modernize legacy apps may also require changes to DevOps pipeline tools and to the people who use them.

Carfax has used BMC's Control-M data management tool for more than 15 years and helped to design and test a jobs-as-code interface added to the tool last year. Application developers use the jobs-as-code interface to describe and automate data transfers among the various systems in the environment before they deploy apps, replacing a manual process that slowed DevOps.

Carfax had job schedulers on duty 24/7 to manually download files to the corporate network and notify database admins (DBAs) to load the data in those files into systems such as MySQL databases and Salesforce customer relationship management. Now, developers automate those workflows.
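To make the jobs-as-code idea concrete, the sketch below describes a file transfer followed by a database load as data rather than as a manual runbook. The folder, job, host and script names are hypothetical, and the field names only approximate the JSON format of BMC's Control-M Automation API, so the vendor documentation is the authority on the real schema.

```python
# Illustrative sketch of jobs-as-code: describe a file transfer and a database
# load as data, then hand the definition to the scheduler instead of performing
# the steps by hand. All names here are hypothetical placeholders, and the
# field names only approximate BMC's documented Control-M Automation API schema.
import json

workflow = {
    "CarfaxDataFeeds": {
        "Type": "Folder",
        "GetPartnerFiles": {
            "Type": "Job:FileTransfer",        # pull files onto the corporate network
            "Host": "transfer-agent-01",
            "FileTransfers": [
                {"Src": "/outbound/daily_feed.csv", "Dest": "/inbound/daily_feed.csv"}
            ],
        },
        "LoadIntoMySQL": {
            "Type": "Job:Database:SQLScript",  # replaces the manual DBA load step
            "Host": "db-agent-01",
            "SQLScript": "/scripts/load_daily_feed.sql",
        },
        "Flow": {
            "Type": "Flow",
            "Sequence": ["GetPartnerFiles", "LoadIntoMySQL"],
        },
    }
}

# Write the definition out; in practice it would be validated and deployed
# with BMC's Automation API tooling rather than run directly from here.
with open("carfax_data_feed.json", "w") as f:
    json.dump(workflow, f, indent=2)
```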

"Nobody needs to be staring at a screen, watching email and moving files anymore," said Robert Stinnett, automation engineer for Carfax. "Nowadays, I'm doing stuff that can help drive us forward as a company."

Stinnett is a good example of how Carfax changed its IT operations roles as DevOps processes and tools matured. When he started with Carfax 15 years ago, Stinnett's job description for the company's mainframe systems was to watch email, download files and notify DBAs.

Then, as Control-M automated data transfer processes in earlier versions, Stinnett designed those data workflows on developers' behalf. Now, he trains those developers to use Control-M jobs-as-code and directs them to Control-M's GitHub-based sandbox environment to design their own automated workflows. This frees him to explore more strategic work, such as a potential integration between on-premises Control-M and its cloud equivalent, the AWS Batch service, and how to tie the two together with AWS Lambda.
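The kind of glue Stinnett is exploring might look like the hedged sketch below: an AWS Lambda handler that submits work to AWS Batch with boto3. The queue and job-definition names are hypothetical placeholders, not Carfax's actual resources.

```python
# Rough sketch: a Lambda function that hands a data-processing job to AWS Batch,
# for example when a new data file lands. Queue and job-definition names are
# hypothetical placeholders.
import boto3

batch = boto3.client("batch")

def handler(event, context):
    """Submit a Batch job in response to an incoming event."""
    response = batch.submit_job(
        jobName="load-daily-feed",
        jobQueue="carfax-data-queue",       # hypothetical queue name
        jobDefinition="load-feed-job:1",    # hypothetical job definition
        containerOverrides={
            "environment": [
                {"name": "SOURCE_KEY", "value": event.get("source_key", "")}
            ]
        },
    )
    return {"jobId": response["jobId"]}
```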

Still, while BMC has modernized products such as Control-M over the last two years, Stinnett acknowledged the vendor doesn't have a hot reputation in bleeding-edge DevOps circles. He said he hopes BMC will create a SaaS version of Control-M to capture cloud users and cloud data, which would also come in handy as Carfax moves to AWS.

BMC confirmed it doesn't have a SaaS version of Control-M yet, but declined to comment on whether such an offering might be on the product's roadmap.
