

Edge computing becomes 'Wild West' of IT world

Among other benefits, edge computing -- a critical component for many IoT applications -- can boost performance and enhance security. But it isn't without its challenges.


End users don't necessarily care where and how data is stored and processed. They just want to know the job will get done quickly, securely and with as few difficulties as possible. Oh, and inexpensively, too.

To meet those expectations, a major trend seems to be getting underway. After the decentralization of the client-server era, when some questioned the need for a central IT function at all, the prevailing trend has been toward centralization, abetted by the cloud. However, with so much data now coming from or going to the edge, schemes are afoot to do more processing there -- both to extract value from the data sooner and to reduce the volume traveling over scarce network bandwidth.

Cisco, for one, is championing an awareness of "perishable" data, a concept that clarifies the evolution of edge computing. There are two main reasons why organizations may need to do more at the edge, according to Mike Flannagan, vice president and general manager of data and analytics at Cisco.

The first is that they don't have enough network bandwidth. He cites the example of an oil or gas company operating an offshore rig. The "downhole" sensors used to monitor progress can generate 10 terabytes of data per day per well.

"You are talking about tens to hundreds of terabytes a day, and since offshore sites are generally connected via satellite at around a megabyte per second, it would take a month to move a day's worth of data," he said. So, if they don't process the data closer to where it is generated, they don't really get the value of the data.

Such data is also often perishable, he notes. Should the drilling operators increase or decrease pressure to improve productivity? In this example, the data has its maximum value when it is freshest, allowing an operator to make an immediate operational decision. If time delays in data transmission and analysis mean the answer takes a week or a month, the information is no longer useful.

Gaining the edge

From an industrial or corporate perspective, there can be huge advantages to edge computing, said Sylvain Fabre, research director at Gartner. Processing data locally means not only faster results, but also less movement of data, since only "results," rather than raw data, are likely to be sent to a central location.
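As a minimal sketch of that "ship results, not raw data" pattern -- with a hypothetical sensor feed and made-up sampling rates -- an edge node might reduce each window of raw readings to a handful of summary fields before anything crosses the WAN:

    import statistics
    from typing import Iterable

    def summarize_window(readings: Iterable[float]) -> dict:
        """Collapse a window of raw sensor samples into a compact summary.

        Only this summary -- a few numbers -- is sent to the central
        site, instead of every raw sample.
        """
        values = list(readings)
        return {
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "mean": statistics.fmean(values),
            "stdev": statistics.pstdev(values),
        }

    # One hour of 10 Hz pressure readings: 36,000 raw samples shrink
    # to five fields at the edge.
    window = (4000.0 + (i % 7) * 0.5 for i in range(36_000))
    print(summarize_window(window))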

Another angle is security. Processing data locally means it is kept in an internal environment, Fabre explains. "Any industrial environment where latency is a problem, not to mention things like drones and self-driving cars, will find value in an edge approach," he said.

And, from a mobile infrastructure standpoint, there has been definite interest in building more compute and storage capacity near the edge. "A couple of years ago Nokia started offering to put functionality in base stations near the users; there were some interesting use cases, but it ultimately proved to be quite expensive," Fabre said.

However, others said the Internet of Things (IoT) is definitely the big news when it comes to edge data.

"This is not really a new concept because IoT is using much of the same topological paradigm as industrial controls," said Christian Renaud, research director for IoT at 451 Research, based in New York. Locating compute resources "on the edge" was traditional in operational systems, "because you need local control, local action and faster response."

"If you look at the process control world, industrial automation, and energy sector, the legacy systems had already been using edge out of necessity," Renaud said. The difference today is the availability of cloud models and the almost infinite extent of IoT, ranging from fitness and factories to cars and healthcare. A good percentage of those actual or potential IoT functions require local compute and analysis for viability, he notes.

"I don't want a bad cable somewhere else to stop my assembly line from operating," he said. Additionally, bandwidth usage, security and data sovereignty have emerged as drivers for doing more locally. "If I have sensitive operational data from an electric utility substation, I don't want to risk transmitting that."

The primary arguments for edge computing are reduced cost and bandwidth use; survivability, because there is no single point of failure; and privacy and security, Renaud said.
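A sketch of the local-control-plus-store-and-forward pattern those arguments imply -- the threshold, the link check and the transport here are all hypothetical stand-ins -- might look like this: control decisions never wait on the WAN, and telemetry is buffered locally until a link is available:

    import random
    from collections import deque

    PRESSURE_LIMIT = 5000.0           # hypothetical trip threshold
    backlog = deque(maxlen=100_000)   # store-and-forward telemetry buffer

    def uplink_available() -> bool:
        """Stand-in for a real link check; here the WAN is simply flaky."""
        return random.random() > 0.5

    def control_step(pressure: float) -> str:
        """Local decision made on site -- never blocked by the network."""
        action = "VENT" if pressure > PRESSURE_LIMIT else "HOLD"
        backlog.append((pressure, action))    # always record locally
        if uplink_available():
            while backlog:
                backlog.popleft()             # stands in for send(sample)
        return action

    for reading in (4800.0, 5100.0, 4950.0, 5300.0):
        print(reading, control_step(reading))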

"The arguments against are that we don't know if things like IoT gateways will be more expensive than centralized cloud functions, especially for applications that don't have demands such as ultra-low latency and strong privacy or security," he said.

Tension runs high

But the tension between central and edge computing will likely spawn new models, as well. For example, Renaud said, cloud providers might offer 10 sites in one rack for "near edge" computation. "If other industries, like VoIP, are an indication, we won't get just one approach or the other," he explained.

One example of a push to extend the paradigm comes from Ryft Systems in Rockville, Md., a provider of high-speed data analytics products that support edge computing. Its focus is to deliver field-programmable gate array (FPGA) compute acceleration and x86 integration for IoT and video surveillance cameras.

"You typically hear about all the data at the edge that must be managed or gotten to the cloud or data center, but that ignores a lot of existing compute power already present at the edge," said Pat McGarry, vice president of engineering. Furthermore, what's really needed at the edge is analytics, and that's not something that the people or IT infrastructure at the edge can handle. "You need data scientists and more powerful machines -- or you need a different, hybrid approach," he said.

Others offered words of caution.

"Even with the promise of IoT, running remote applications on the network's edge -- to apply IoT analytics on site or streamline data flowing home -- might give CTOs pause," said Guru Chahal, head of product at Avi Networks, a provider of cloud application delivery platforms in Santa Clara, California.

Chahal warns that those applications could behave badly, and there are upgrades and maintenance to consider. Therefore, those looking to implement edge computing must devise plans for load balancing remote servers and monitoring application performance in conjunction with their overall computing strategies.
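What that might look like in its simplest form -- the endpoints, health path and timeout below are all hypothetical -- is a health-checked rotation over remote edge servers, with response latency doubling as a basic performance signal:

    import itertools
    import time
    import urllib.request
    from typing import Optional

    EDGE_SERVERS = [                          # hypothetical edge endpoints
        "http://edge-a.example.com/health",
        "http://edge-b.example.com/health",
        "http://edge-c.example.com/health",
    ]

    def probe(url: str, timeout: float = 2.0) -> Optional[float]:
        """Return response latency in seconds, or None if unhealthy."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    return time.monotonic() - start
        except OSError:
            pass
        return None

    def healthy_rotation():
        """Round-robin over only the servers that pass a health probe."""
        live = [url for url in EDGE_SERVERS if probe(url) is not None]
        if not live:
            raise RuntimeError("no healthy edge servers")
        return itertools.cycle(live)

    # Each request goes to the next healthy server in turn.
    servers = healthy_rotation()
    target = next(servers)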

"Supporting multi-cloud deployments and delivering application services, such as load balancing and application analytics, closest to the actual applications is ever more important," he said.

In short, edge data has become the Wild West of IT -- with lots of potential for growth balanced against plenty of potential for confusion and chaos.

Alan Earls is a Boston-based freelance writer focused on business and technology.

