There are plenty of technologies touted as the next big thing. Big data, flash, high-performance computing, in-memory processing, NoSQL, virtualization, convergence, software-defined whatever: all represent wild new forces that could bring real disruption, but also big opportunities, to your local data center.
As a senior analyst at Taneja Group, I will discuss what we're learning in our ongoing IT industry research. We specialize in analyzing disruptive new technologies -- figuring out the opportunities they present and identifying what will actually work in practice.
We must get past the marketing hype and wade through conflicting messages from competing vendors. In their enthusiasm, brash startups can make grossly exaggerated claims, while larger incumbents might introduce fear, uncertainty and doubt when faced with new competition. How do you decide what will be the next big thing in technology?
Ripe for disruption
New products promise a compelling increase in performance, efficiency, productivity or end results. Sometimes these improvements justify an immediate rip and replace, but it's more likely that a careful evolutionary approach is warranted. For example, big data presents a potentially disruptive opportunity. The amount of interesting and available data is growing fast. Our competitive natures make us want to mine all the value out of it as quickly as we can. In response, a multitude of emerging infrastructure systems offers to help us cruise through these floods of data. It can be hard to know where to look first.
In this column, I explore new approaches to big data inspired by advances in high-performance computing, Web-scale applications and supercomputing architecture, including new forms of distributed storage, in-memory processing, cross-cloud workflows and scale-out processing platforms like Hadoop.
Bigger is badder, but in a good way
Apache Hadoop has been holding out great promise for several years, but many non-Web 2.0 companies have been disappointed or are still waiting for their big data "killer app" -- one that doesn't require fielding an army of expensive data scientists.
However, a large crop of new analytical applications could make it easier for spreadsheet-focused business analysts to directly mine big data sets. The implication for the common data center is that there will soon be a much wider business-side demand for big data analysis platforms.
SQL is not enough
We are also carefully watching the NoSQL movement, using its broadest interpretation as "new or not only SQL." Traditional relational database management systems (RDBMS) have inherent constraints that some enterprising open source communities decided were stifling innovation.
By relaxing certain requirements, like guaranteeing an immediately consistent view of data to all clients, NoSQL solutions can instead deliver higher availability, scalability and distributed performance.
NoSQL isn't going to outright replace the RDBMS, but IT will be expected to host and manage various mixes of database types as developers build out layered applications.
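The consistency trade-off described above can be sketched with the Dynamo-style quorum idea used by several NoSQL systems: writes are acknowledged by only W of N replicas (keeping the system available when nodes are slow or down), and reads consult R replicas, with R + W > N guaranteeing the read set overlaps the latest write set. This toy simulation is purely illustrative and not any particular product's API:

```python
import random

# Illustrative Dynamo-style quorum replication (a toy sketch, not a real
# NoSQL client). Each of N replicas holds a (version, value) pair; a write
# lands on W replicas, a read queries R and keeps the highest version seen.
# With R + W > N, every read quorum overlaps every write quorum.

N, W, R = 5, 3, 3  # R + W = 6 > 5, so reads see the latest acknowledged write
replicas = [{"version": 0, "value": None} for _ in range(N)]

def write(value, version):
    # Only W of the N replicas need to take the write for it to succeed --
    # this is where the availability gain over a strict RDBMS comes from.
    for i in random.sample(range(N), W):
        replicas[i] = {"version": version, "value": value}

def read():
    # Query R replicas and trust the freshest copy seen.
    picked = random.sample(range(N), R)
    return max((replicas[i] for i in picked), key=lambda r: r["version"])["value"]

write("v1", version=1)
write("v2", version=2)
print(read())  # always "v2": the read quorum must overlap the latest write quorum
```

Relaxing the quorum sizes (say, R = 1) makes reads faster and the system more available, at the cost of occasionally serving stale data -- exactly the dial these systems let developers turn.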
Convergence in the data center
One of the big data center trends we're covering is infrastructure convergence, especially hyperconvergence, where servers, storage and networking are collapsed into Lego-like building blocks. But do they really work?
Converged solutions simplify operations and trim operational expenditures while promising to reduce wasted capital expenses. The best convergence stories tell us how to use the tight integration among components to minimize processing overhead, optimize resource capacity, and maximize availability and performance.
Converged solutions tend to overlap with acceleration technologies, as when flash is added to servers to act as both an I/O cache and a persistent storage tier. I'll be examining new types of infrastructure convergence and hot acceleration technologies to see how each might best improve your data center's performance and total cost of ownership.
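The flash-as-cache idea can be sketched in a few lines. This is a minimal write-through LRU cache fronting a slower backing tier -- names and sizes are invented for the demo, and real acceleration products layer persistence, invalidation and failover on top:

```python
from collections import OrderedDict

# Toy write-through cache, illustrating how server-side flash can front a
# slower backing store. Purely a sketch; not modeled on any vendor's product.

class WriteThroughCache:
    def __init__(self, capacity, backing_store):
        self.capacity = capacity        # number of "flash" slots
        self.cache = OrderedDict()      # LRU order: oldest entry first
        self.backing = backing_store    # the slow "disk" tier
        self.hits = self.misses = 0

    def write(self, key, value):
        # Write-through: the backing tier is updated on every write, so
        # losing the cache never loses data.
        self.backing[key] = value
        self._insert(key, value)

    def read(self, key):
        if key in self.cache:
            self.hits += 1
            self.cache.move_to_end(key)     # refresh LRU position
            return self.cache[key]
        self.misses += 1                    # slow path: fetch from disk
        value = self.backing[key]
        self._insert(key, value)
        return value

    def _insert(self, key, value):
        self.cache[key] = value
        self.cache.move_to_end(key)
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

disk = {}
cache = WriteThroughCache(capacity=2, backing_store=disk)
cache.write("a", 1)
cache.write("b", 2)
cache.read("a")          # hit
cache.write("c", 3)      # evicts "b", the least recently used entry
cache.read("b")          # miss: refetched from the backing tier
print(cache.hits, cache.misses)  # 1 1
```

The same structure, with the dictionary swapped for a flash device and the backing store for disk arrays, is the essence of the server-side caching tier these converged and acceleration products compete on.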
Does 'software-defined' mean anything?
The next big thing could also be a better version of the last big thing. Virtualization and the cloud still have a few tricks up their sleeves. Platform as a Service (PaaS) is still in its relative infancy, while application blueprints and cloud application hypervisors are just emerging. We all see hybrid opportunities down the road, but when will they get here, and what will they look like?
And just what does 'software-defined' mean? Are these products delivering greater agility, tighter integration and more intelligent operations, or just the same old set of disparate application programming interfaces under a new name? I have some definite opinions that should help you understand what's really going on as more and more vendors claim their management products are software-defined.
The next really big thing
Is the next big thing going to be like opening Pandora's box, or will it transport your company to the next level? The worst thing you can do is put your head down and wait for something to happen. It's critical to keep up with, and hopefully ahead of, what might be the next big thing in technology.
And the proverbial door here is always open. If you see the shadow of something large on the horizon, let me know, and maybe we can help shine some light on it.