To "software define" the entire data center, big changes are taking place in the underlying hardware. Some of those changes may be too big for today's data centers.
A software-defined infrastructure includes chip-level virtualization accelerators, virtual storage accelerators, network packet accelerators and so on, said Shannon Poulin, vice president (VP) of Intel Corp.'s Datacenter and Connected Systems Group.
"The software-defined trend requires standardized hardware, but built for virtualization," he said. "We're used to this in servers. Now we need this same concept in storage and networks."
This includes "smart" hardware that integrates with the control software throughout the data center, and a renewed focus on the CPU. More network control will take place at the CPU, said Zeev Draer, VP of strategic marketing at networking equipment supplier MRV Communications Inc., based in Chatsworth, Calif.
As processing moves up the network stack, CPU resources will be in demand to inspect network traffic and make decisions. Network controllers backed by powerful CPU clusters will bring centralization and flexibility to the software-defined infrastructure, replacing locally aware, isolated and power-intensive network elements, Draer said.
Other hardware components, such as network data path interfaces, become dumb with software-defined networking (SDN) and could become more affordable.
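The control/data-plane split described here can be pictured with a toy model: the "dumb" data-path element only matches packets against a flow table, and all decision-making lives in a centralized controller. This is an illustrative sketch, not a real controller API; all class and method names are invented for the example.

```python
class Switch:
    """A 'dumb' data-path element: it only matches traffic
    against a flow table installed by the controller."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination address -> output port

    def install_flow(self, dest, port):
        self.flow_table[dest] = port

    def forward(self, dest):
        # No local decision-making; unknown flows are dropped here.
        # (A real switch would typically punt them to the controller.)
        return self.flow_table.get(dest, "drop")


class Controller:
    """Centralized control layer holding the global network view."""
    def __init__(self):
        self.switches = {}

    def register(self, switch):
        self.switches[switch.name] = switch

    def push_policy(self, switch_name, dest, port):
        # A 'southbound' instruction: program the data path remotely.
        self.switches[switch_name].install_flow(dest, port)


ctrl = Controller()
sw = Switch("edge-1")
ctrl.register(sw)
ctrl.push_policy("edge-1", "10.0.0.5", "port-2")
print(sw.forward("10.0.0.5"))  # forwards on port-2
print(sw.forward("10.0.0.9"))  # no rule installed, so: drop
```

The point MacVittie makes follows from the shape of the model: the switch class is trivial and interchangeable, while all the value (and complexity) sits in the controller.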
"The value in SDN and network virtualization is at the control layer," said Lori MacVittie, senior product manager of emerging technologies at Seattle-based F5 Networks Inc., which makes application delivery networking technology.
Software-defined data centers enable IT control
"While compute needs are growing, infrastructure [today] is basically static," Poulin said -- even with virtualization and cloud options. Users tell IT what they need, IT wires it all up, and the resources sit there -- unmonitored.
"We need to police the overprovisioning," Poulin said.
Software-defined data centers (SDDCs) include orchestration software for on-the-fly provisioning and management.
With server virtualization, the CPU and software issue commands to reconfigure the server, create virtual machines, and adjust capacity and utilization. Apply this to the multilayer network, and networking devices will understand the southbound protocol language and accept instructions on the fly -- either for specific layers, or as multilayer gateways that handle varied flows from the physical layer up to the IP layer, Draer explained.
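The analogy above, reprovisioning shared hardware on the fly the way a hypervisor reshapes a server, can be sketched as a toy orchestrator that carves one resource pool into different workload profiles. The pool sizes, profile names and `provision` function are all hypothetical, invented for illustration; no real orchestration API is implied.

```python
# One shared pool of data center resources (illustrative numbers).
POOL = {"cpu_cores": 64, "bandwidth_gbps": 40}

# Hypothetical workload profiles, per the compute-heavy vs.
# high-bandwidth example Poulin describes.
PROFILES = {
    "compute_heavy":   {"cpu_cores": 56, "bandwidth_gbps": 8},
    "bandwidth_heavy": {"cpu_cores": 16, "bandwidth_gbps": 36},
}

def provision(profile_name):
    """Allocate the shared pool for one workload profile,
    the way orchestration software would issue reconfiguration
    commands -- without touching the physical hardware."""
    want = PROFILES[profile_name]
    for resource, amount in want.items():
        if amount > POOL[resource]:
            raise ValueError(f"pool too small for {resource}")
    return dict(want)

# Configure for a compute-heavy project...
alloc = provision("compute_heavy")
# ...then reconfigure the same pool for a high-bandwidth project.
alloc = provision("bandwidth_heavy")
```

The same physical pool serves both profiles; only the software-issued allocation changes between projects.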
"The IT team might not make any changes at first. But eventually, with the ability to adjust provisioning, you'll figure out how to use your resources in a better way," Poulin said.
But it isn't just about the technical performance payoff.
IT directors need to explain the efficiency and operational value of investing in a hardware and software stack tailored for a software-defined infrastructure. Unifying network and storage with compute creates a lot of value, Poulin noted.
"[You can] reconfigure the infrastructure at will between workloads. Configure for a compute-heavy project, then reconfigure for a project that uses high bandwidth without ever having to change the data center resources," he said.
Don't just spout off hardware specifications for decision makers, he said; demonstrate the concept.
Who's buying into software-defined data centers?
It isn't just the managed service providers that should invest in software-defined infrastructure.
"If you're running a data center today, you're big enough for a software-defined data center," said Poulin, who will speak on SDDCs at the Data Center World Fall 2013 conference in Florida. "If you have one rack or a small closet, you're more likely looking for a way to not run a data center anymore, and outsourcing is attractive."
But software-defined infrastructure is in its infancy, and will likely evolve much like virtualization once did. Companies will try it out on one or two racks running a low-priority app, then launch new projects once benefits are proven, Poulin said.
Companies that are still working toward full virtualization aren't ready to make everything software-based just yet.
"We keep it on the table as an option to include on our three- to five-year roadmap, but I don't see value in software-defined data centers today," said Dave Chivers, VP and chief information officer of VSE Corporation, a technical services company in Alexandria, Va., that works with government agencies and contractors.
Lack of interest in SDDC is common among IT pros who don't see a real need for it yet.
"[Software-defined] seems like a solution to a problem that doesn't exist," said Doug Feltman, director of systems and applications at New York-based 24 Seven Inc., a staffing company.
Feltman operates a mix of physical and virtualized servers in the company's New York and California data centers. Some applications -- such as Office 365 -- fit naturally in the cloud, while others should remain firmly planted on hardware. Given that mix, Feltman may consider a hybrid cloud approach instead of SDDC.