
AI apps demand DevOps infrastructure automation

Businesses look to AI apps to gain a competitive edge, but IT teams must make rapid application changes through a DevOps process and embrace infrastructure automation.

Artificial intelligence can offer enterprises a significant competitive advantage for some strategic applications, but enterprise IT shops will also require DevOps infrastructure automation to keep up with frequent iterations.

Most enterprise shops won't host AI apps in-house, but those that do will turn to sophisticated app-level automation techniques to manage IT infrastructure. And any enterprise that wants to inject AI into its apps will require rapid application development and deployment -- a process early practitioners call "DevOps on steroids."

"When you're developing your models, there's a rapid iteration process," said Michael Bishop, CTO of Alpha Vertex, a fintech startup in New York that specializes in AI data analysis of equities markets. "It's DevOps on steroids because you're trying to move quickly, and you may have thousands of features you're trying to factor in and explore."

DevOps principles of rapid iteration will be crucial to train AI algorithms and to make changes to applications based on the results of AI data analysis at Nationwide Mutual Insurance Co. The company, based in Columbus, Ohio, experiments with IBM's Watson AI system to predict whether new approaches to the market will help it sell more insurance policies and to analyze data collected from monitoring devices in customers' cars that help it set insurance rates.

"You've got to have APIs and microservices," said Carmen DeArdo, technology director responsible for Nationwide's software delivery pipeline. "You've got to deploy more frequently to respond to those feedback loops and the market."

DevOps infrastructure automation

Rapid iteration puts greater pressure on IT ops to provide developers and data scientists with self-service access to automated infrastructure. Nationwide relies on ChatOps for self-service: chatbots limit how often developers must switch between different interfaces for application development and infrastructure troubleshooting, and they allow developers to correct application problems before those problems reach a production environment.
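At its core, a ChatOps bot is a command dispatcher that maps chat verbs to infrastructure actions. A minimal sketch of the pattern in Python (the command names and stubbed actions here are hypothetical illustrations, not Nationwide's actual tooling):

```python
# Minimal ChatOps-style command dispatcher (a hypothetical sketch).
# A chatbot maps chat commands to infrastructure actions so developers
# can deploy and troubleshoot without leaving the chat interface.

def deploy(service: str, version: str) -> str:
    # Stub: a real bot would call a CI/CD or orchestration API here.
    return f"deploying {service} at {version}"

def status(service: str) -> str:
    # Stub: a real bot would query monitoring for live health data.
    return f"{service}: healthy"

COMMANDS = {"deploy": deploy, "status": status}

def handle_message(text: str) -> str:
    """Parse a message like '!deploy billing v1.2' and run the mapped action."""
    if not text.startswith("!"):
        return ""  # ignore ordinary chat traffic
    parts = text[1:].split()
    if not parts:
        return ""
    verb, args = parts[0], parts[1:]
    action = COMMANDS.get(verb)
    if action is None:
        return f"unknown command: {verb}"
    try:
        return action(*args)
    except TypeError:
        return f"usage: check arguments for '{verb}'"
```

The same dispatcher shape plugs into Slack, Mattermost or any other chat platform; only the transport changes, which is what keeps developers inside a single interface.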

AI apps push the limits of DevOps infrastructure automation

Enterprise IT pros who support AI apps quickly find that no human can keep up with the required rapid pace of changes to infrastructure. Moreover, large organizations must deploy many different AI algorithms against their data sets to get a good return on investment, said Michael Dobrovolsky, executive director of the machine learning practice and global development at financial services giant Morgan Stanley in New York.

"The only way to make AI profitable from an enterprise point of view is to do it at scale; we're talking hundreds of models," Dobrovolsky said. "They all have different lifecycles and iteration [requirements], so you need a way to deploy it and monitor it all. And that is the biggest challenge right now."

Houghton Mifflin Harcourt, an educational book and software publisher based in Boston, has laid the groundwork for AI apps with infrastructure automation that pairs Apache Mesos for container orchestration with Apache Aurora, an open source utility that allows applications to automatically request infrastructure resources.
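Aurora jobs are declared in a Python-based DSL in which an application bundles its processes, tasks and resource requests, so the infrastructure demand travels with the app rather than living in an ops runbook. The sketch below follows the shape of Aurora's documented hello-world configuration; the names and resource figures are illustrative, not Houghton Mifflin Harcourt's actual config:

```python
# Sketch of an Apache Aurora job definition (.aurora file, Python-based DSL).
hello = Process(
    name='hello_world',
    cmdline='echo "serving model requests"')

hello_task = Task(
    processes=[hello],
    resources=Resources(cpu=1.0, ram=128 * MB, disk=256 * MB))

jobs = [Service(
    task=hello_task,
    cluster='devcluster',
    role='www-data',
    environment='devel',
    name='hello_world_service',
    instances=2)]
```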


"Long term, the goal is to put all the workload management in the apps themselves, so that they manage all the scheduling," said Robert Allen, director of engineering at Houghton Mifflin Harcourt. "I'm more interested in two-level scheduling [than container orchestration], and I believe managing tasks in that way is the future."

Analysts agreed that application-driven infrastructure automation will be the ideal way to support AI apps.

"The infrastructure framework for this will be more and more automated, and the infrastructure will handle all the data preparation and ingestion, algorithm selection, containerization, and publishing of AI capabilities into different target environments," said James Kobielus, analyst with Wikibon.

Automated, end-to-end, continuous release cycles are a central focus for vendors, Kobielus said. Tools from companies such as Algorithmia can automate the selection of back-end hardware at the application level, as can services such as Amazon Web Services' (AWS) SageMaker. Some new infrastructure automation tools also provide governance features such as audit trails on the development of AI algorithms and the decisions they make, which will be crucial for large enterprises.

Early AI adopters favor containers and serverless tech

Until app-based automation becomes more common, companies that work with AI apps will turn to DevOps infrastructure automation based on containers and serverless technologies.

Veritone, which provides AI apps as a service to large customers such as CBS Radio, uses Iron Functions, now the basis for Oracle's Fn serverless product, to orchestrate containers. The company, based in Costa Mesa, Calif., evaluated AWS Lambda a few years ago, but saw Iron Functions as a more suitable combination of functions as a service and containers. With Iron Functions, containers can process more than one event at a time, and functions can attach to a specific container, rather than exist simply as snippets of code.

"If you have apps like TensorFlow or things that require libraries, like [optical character recognition], where typically you have to use Tesseract and compile C libraries, you can't put that into functions AWS Lambda has," said Al Brown, senior vice president of engineering for Veritone. "You need a container that has the whole environment."
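The container-plus-functions model Brown describes can be sketched as a "hot function": a long-lived container whose handler loops over incoming events, with heavyweight native dependencies such as Tesseract or TensorFlow baked into the container image rather than squeezed into a function package. The handler below is a hypothetical illustration of the pattern, not Veritone's code:

```python
import json
import sys

def handle(event: dict) -> dict:
    # Hypothetical handler; a real one might invoke Tesseract OCR or a
    # TensorFlow model whose compiled libraries live in the container image.
    text = event.get("text", "")
    return {"ok": True, "chars": len(text)}

def main() -> None:
    # "Hot function" loop: the container stays warm and processes many
    # events over its lifetime, instead of one invocation per cold start.
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        print(json.dumps(handle(json.loads(line))))

if __name__ == "__main__":
    main()
```

Because the process outlives any single event, expensive setup such as loading a model or initializing a native library happens once per container, not once per invocation.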

Veritone also prefers this approach to Kubernetes and Mesos, which focus on container orchestration only.

"I've used Kubernetes and Mesos, and they've provided a lot of the building blocks," Brown said. "But functions let developers focus on code and standards and scale it without having to worry about [infrastructure]."

Beth Pariseau is senior news writer for TechTarget's Cloud and DevOps Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.
