
SDN, hyperscale data center technologies to impact strategies in 2014

Software-defined everything, real-world implementations of converged infrastructure and a hybrid cloud strategy should be on your agenda in 2014.

IT pros spend a lot of time fighting fires, but they should focus on strategic improvements as well. With that in mind, experts shared their predictions for data center technologies to watch, helping set the strategic agenda for your data center in 2014.

Total IT spending will reach $2.14 trillion in 2014, Framingham, Mass.-based analyst firm IDC predicted at its annual Worldwide Enterprise Server event in December. Spending on servers worldwide will top $53 billion, up 2% from 2013. Intel-based servers will drive much of that growth, with an expected $40 billion or more in sales, the firm said.

"[The] x86 [market] is where most of the growth is going and a lot of it is going to be driven by Intel's Ivy Bridge refresh, which will take hold [in 2014]," said Jed Scaramella, research manager of servers at IDC.

Cloud growth

The transformation to the third platform -- a convergence of cloud, big data, social business and mobile technologies -- is happening faster than anything else in this industry, noted IDC analysts.

"A lot of that is being driven by consumer behavior and the world of mobile, and what's both being enabled via the cloud and also what it's driving via the cloud," said Matt Eastwood, group vice president and general manager of enterprise platforms at IDC.

However, despite the rapid rise of third-platform IT, the first and second platforms, characterized by mainframe/terminal and client-server systems respectively, will be key industry drivers for years to come, IDC predicts. Some 60% of server units and 75% of server revenue will be driven by first and second platform workloads in 2014.

Enterprises can defer long-term capital expenditures with hybrid cloud models, such as moving some workloads to cloud to free up resources for on-premises workloads, said David Cappuccio, managing VP and chief of research on infrastructure at Stamford, Conn.-based Gartner Inc., speaking during the Gartner Data Center Summit 2013 in Las Vegas.

"For many [enterprises], 2014 will involve an owned data center and/or colocation facilities working alongside ... [Infrastructure, Platform and Software as a Service] markets," said Clive Longbottom, co-founder and service director at analyst firm Quocirca, based in the U.K.

Making infrastructure, platform and software as a service work with the traditional data center resources means understanding all the dependencies in the chain, Longbottom added, so expect higher interest in data center infrastructure management tools.
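Longbottom's point about understanding every dependency in the chain can be illustrated with a toy impact analysis. The service names and dependency map below are hypothetical examples invented for this sketch; real DCIM tools discover such relationships automatically.

```python
# Hypothetical sketch of a DCIM-style dependency check: given a map of
# what each service relies on, work out what breaks when one component
# fails. All names below are invented for illustration.
deps = {
    "payroll-app": ["app-server-01", "saas-hr-api"],
    "app-server-01": ["rack-12-pdu", "core-switch-a"],
    "saas-hr-api": ["internet-uplink"],
    "internet-uplink": [],
    "rack-12-pdu": [],
    "core-switch-a": [],
}

def affected_by(component, graph):
    """Return every service that directly or transitively depends on
    the given component (i.e., what is impacted if it goes down)."""
    impacted = set()
    changed = True
    while changed:
        changed = False
        for svc, reqs in graph.items():
            if svc not in impacted and (component in reqs
                                        or impacted & set(reqs)):
                impacted.add(svc)
                changed = True
    return impacted

print(sorted(affected_by("core-switch-a", deps)))
# -> ['app-server-01', 'payroll-app']
```

The same traversal answers the hybrid-cloud question in reverse: an outage of the external `saas-hr-api` dependency ripples back into on-premises services that consume it.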

Eastwood added that the major challenge for all corporate IT shops is how to best manage through this transition and make strategic bets on the future, while not dramatically moving away from legacy systems built up over the past few decades.

The hyperscale IT effect

"The third platform itself is driving a massive amount of investment in new Web-scale or hyperscale data centers that power cloud, mobile, social and analytic type workloads," Eastwood said. "We know companies such as Google [Inc.], [, Inc.], Microsoft and Facebook are literally spending billions of dollars on new data centers and server infrastructures to power their workloads in these data centers."

Server types, variety and complexity are going up, due to pressure from these hyperscale IT companies, said Carl Claunch, VP and distinguished analyst at Gartner Inc. Since server vendors have to meet the needs of both hyperscale companies and enterprises, there will be a bevy of choices available on new servers.

For example, there's a need for low-power servers in the higher-density data centers popularized by Web-scale IT companies. One of those low-power options is the system-on-a-chip (SoC), which integrates components such as networking controllers, storage controllers, co-processors and memory alongside the processor. These SoCs will show up in some, if not all, new server designs.

Even though most enterprises don't operate like Facebook or Amazon, they are going to take tips and cues from hyperscale data centers, whether that's buying custom servers, replacing or repairing equipment faster, designing around a recovery time objective or just letting go of the past, summarized Gartner's Cappuccio.

Software-defined everything

The software-defined everything (SDx) movement will go from lab concept to real data center technology in 2014, said SearchDataCenter contributor Pete Sclafani, CIO of 6connect Inc.

"Virtualization of the network stack is a long time coming. With the increasing validation of the [return on investment], enterprises and service providers are going to start seeing how these provisioning tools can tie into their automation efforts," Sclafani said.

SDx will continue to aggregate around "the software defined data center" concept, but the "theoretical purity of the approaches will be undermined by vendors competing at a functional level and breaking the standards," said Longbottom.

Converged infrastructure adoption

Converged infrastructure is becoming a standard platform for early adopters and it may go mainstream in 2014.

"In this past year, people who said they were very likely to adopt an integrated system in the next three years jumped up 38%," said IDC's Scaramella. "It really went from an emerging concept of customers still evaluating it to a jump in the number of customers saying, 'Yeah, this is definitely something we're going to consider in the next two years.'"

These prepackaged, integrated systems are differentiated by software and how well the hardware performs as a unit, said Gartner's Claunch. They can be tailored to support a specific application to the best degree possible, but adaptability is limited and not everyone should assume converged infrastructure systems, like Cisco's Unified Computing System (UCS), will benefit their IT operations. You can end up with islands in the data center, counter to what virtualization has achieved, where workloads move around interconnected hosts and resources, Claunch said. And they require synchronized budgets for network, compute and the other elements of a data center deployment -- a rare practice today.

"You're throwing away a lot of existing investments to adopt integrated systems. Maybe the servers are ready for a refresh, but your network was just upgraded last year, and storage is two years old and not depreciated yet," he explained.

VCE Vblocks, IBM PureFlex, Cisco UCS, Dell VRTX and other converged products can simplify the data center in regards to power distribution, cabling and cooling, Quocirca's Longbottom said, but the cost of retrofitting older facilities to take advantage of these savings can be prohibitive.

Data, data and more data

Big data analytics will be huge in 2014. The digital universe will nearly double to six zettabytes in 2014. While big data is largely created at the edge, more and more of it is moving into the data center as well.

Big data also means big storage. And the notion of cheap storage is shifting, according to Sclafani, especially when it comes to solid state drives (SSDs).

"The focus has moved to storage speed, namely how SSDs can be deployed in various ways," Sclafani said. "Hybrid local storage options are interesting, but it's really giving storage admins more budget-friendly solutions to tackle speed versus just adding spindles."

Where storage admins once had to be very selective about deploying SSDs, falling SSD costs mean they will become a should-have data center option.
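The "speed versus just adding spindles" trade-off Sclafani mentions comes down to simple arithmetic. The drive prices and IOPS figures below are illustrative assumptions, not vendor data, but they show why SSDs win on random-I/O-heavy workloads even at a higher per-drive price.

```python
# Back-of-the-envelope comparison: delivering a random-read IOPS
# target with SSDs versus adding 15K-rpm spindles. All prices and
# performance numbers are assumed for illustration only.
TARGET_IOPS = 20_000

ssd_iops, ssd_price = 50_000, 600   # assumed per-SSD figures
hdd_iops, hdd_price = 200, 250      # assumed per-HDD figures

ssd_drives = -(-TARGET_IOPS // ssd_iops)  # ceiling division
hdd_drives = -(-TARGET_IOPS // hdd_iops)

print(f"SSD: {ssd_drives} drive(s), ${ssd_drives * ssd_price}")
print(f"HDD: {hdd_drives} drive(s), ${hdd_drives * hdd_price}")
```

Under these assumptions a single SSD covers the IOPS target for $600, while spinning disks need 100 spindles at $25,000 -- before counting the power, rack space and controllers those spindles require. Capacity-dominated workloads, of course, can still favor disk.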

Ed Scannell, Meredith Courtemanche and Michael Anderson contributed to this report.


Join the conversation


On software-defined everything, we should be a bit careful with how we talk about standards. The space is emerging still, and that means that meaningful standards are not yet complete. The OpenFlow work has been fantastic, but OpenFlow is just a part of the broader SDN framework that people ought to be watching.

What is more interesting than if an SDO has completed a draft of something is whether the open source communities are driving meaningful code. OpenDaylight, as an example, should release code in January or February. That code will deliver a bunch of stuff. Whether or not it is standard is less important than whether it is useful.

Standards ought to lag adoption. Because SDN is so early, we ought not ask for standards too quickly. We still need to iterate on the technologies and architectures some.

Mike Bushong (@mbushong)
Data center design is currently a booming business here in Africa, and I am eager to become knowledgeable and certified in data center implementations:
- Cooling effects
- Fiber optic cable vs. UTP cable applications
- Library systems at universities.
So please let us work together to build expertise in data center applications.
Wallelegn Taye, Ethiopia