Weigh IT infrastructure costs in your DevOps decision
Who wouldn't want the benefits that the DevOps methodology offers? Blending the development and operations teams into a smooth conveyor belt of applications is ideal in our modern, speed-obsessed culture.
Yes, it requires a culture change in interdepartmental communication. But DevOps adoption demands more than getting developers and ops leaders on the same page. DevOps means more automation, and that always has consequences for long-standing elements of the business. A business that embraces DevOps must consider how the change will affect IT infrastructure costs.
IT impacts a business on multiple levels, and increasing one aspect of it without putting additional resources into the others creates an imbalance that can ruin any chance for success. Just as programs and code must work together to create a seamless application, the same is true with DevOps and infrastructure.
In this handbook, IT expert Brian Kirsch explains what administrators need to think about from an infrastructure perspective when the business is considering DevOps adoption. Legacy infrastructure often can't keep up with DevOps' fast-paced nature, so it frequently makes sense to upgrade or replace that hardware. But do the savings that come from DevOps offset the cost of those improvements? The answer is different for every business. It's often a question of capital vs. operating expenses, and businesses need to rethink IT infrastructure costs accordingly.
DevOps is part of the push to be continually faster. Hardware will always be part of the equation, though, and admins shouldn't lose sight of it.