Edge Intelligence is Born

Image Credit: blackboard/Bigstockphoto.com

In recent years, content delivery, IoT, and emerging information and operational technology applications have begun to impose increasingly stringent requirements on networks. The latency and bandwidth demands of today’s applications call for further optimization of network transport and for data processing at or close to end devices – the edge.

Until recently, technologists viewed cloud computing as the prevailing choice for controlling connected things and their data while providing intelligence based on that data. Response times and network load, however, are now proving too high for cloud-based solutions. Distributing intelligence at the edge removes the need for traffic to travel deeper into the network: sending data back to the cloud over a mobile network and waiting for a response can cause unacceptable delays. Intelligence must be localized so it can work on the data as soon as it hits the network. Intelligence at the edge will reduce delays and improve the user experience.
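To make the latency argument concrete, here is a minimal sketch – not from the article – comparing local edge processing with a round trip to a distant cloud region. The latency figures are purely illustrative assumptions, not measurements.

```python
# Illustrative sketch: edge vs. cloud response time under assumed latencies.
# All numbers below are hypothetical placeholders, not real measurements.

EDGE_PROCESSING_MS = 5    # assumed time to analyze data on a nearby edge server
CLOUD_RTT_MS = 80         # assumed mobile-network round trip to a distant cloud region
CLOUD_PROCESSING_MS = 5   # assumed processing time once the data reaches the cloud


def response_time_edge() -> float:
    """Data is handled where it lands, so no wide-area round trip is needed."""
    return EDGE_PROCESSING_MS


def response_time_cloud() -> float:
    """Data crosses the mobile network to the cloud and the result travels back."""
    return CLOUD_RTT_MS + CLOUD_PROCESSING_MS


if __name__ == "__main__":
    print(f"Edge:  ~{response_time_edge():.0f} ms")
    print(f"Cloud: ~{response_time_cloud():.0f} ms")
```

Under these assumed figures the edge path responds roughly an order of magnitude faster, which is the gap the article describes as "unacceptable delays" for latency-sensitive applications.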

This raises the question: how do we define what, or where, the edge is? In some ways it is a moving target. Some define the edge as the point where a service is produced and consumed, or as the natural point where data is handed over to something or someone different. An IoT server crunching the data from all of its collection points has its edge in that server. An ISP collecting content from a carrier often has its edge in a larger data center within a major city. Take the car industry as an example. A manufacturer needs to conduct test drives in winter conditions, either where it is actually winter or where, in the off-season, winter conditions can be created at an affordable cost. To analyze a test drive within a short period of time and re-test without losing too much of it, the manufacturer needs substantial compute power very close to the test facility. It would be far too costly and time consuming to send this data back to corporate headquarters or a production plant. Instead, it is ideal to locate the server very close to the test drive facility, putting the edge just around the corner.

IoT, storage and when the edge is not the answer

There are cases where edge computing really is not the answer. If you need to analyze vast amounts of data collected over a large geographical area, it is only when all of that data sits in a single location that you can draw the conclusions you need. A centrally located mega data center built on cheap, green power, a stable climate and experienced data center experts would fit those needs perfectly.

The IoT trend is naturally also pushing the edge further out from the main centers. For a farmer collecting loads of data from hundreds of devices that monitor the soil, the water quality and the weather, data that is needed at the farm itself, a server in a nearby edge data center would fit the bill. There is no need to send this data long distances, since its purpose is to be used at the farm. But add neighboring farms with the same need that want to compare data, and suddenly there is a reason to connect to more centrally located data centers.

Applications with a natural storage need are another example that does not really belong at the edge. Companies that keep records of transactions or store larger data sets should not consider the edge. Here, the need is once again to find the most suitable data center location, with a vast amount of cheap, green power combined with a good climate and experienced data center staff. Around the globe, places in the Nordics and some parts of the Pacific Northwest in the U.S. possess these qualities.

To ensure distributed intelligence, data center-style facilities should sit closer to where the data is generated. “Data center-style” can mean anything from a server inside a mobile base station to a complete data center with hundreds of racks and a vast amount of power. Connecting all of these data centers - now distributed far beyond the usual hubs in the largest cities, into places where fewer people live - is a challenge for an operator, and especially for an international carrier. Even the smallest data center at the edge of the network needs to be connected to the core to reach its full potential. What we are likely to see is a multi-faceted infrastructure in which learning happens both at the edge of the network and in the cloud, reaping value from both and thus enabling distributed intelligence. This means providing intelligence by keeping information closer to the devices. A combination of edge and core is therefore essential to be successful.
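One way to picture this edge-plus-core split is the sketch below. It is an assumption-laden illustration, not the author's design: hypothetical edge sites (here, the farms from the earlier example) process raw readings locally and forward only compact summaries to a central core, which combines them for wider analysis.

```python
# Illustrative sketch of a distributed-intelligence pattern: process raw data
# at the edge, send only summaries to the core. Names and values are hypothetical.

from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class EdgeSummary:
    site: str
    sample_count: int
    average: float


def summarize_at_edge(site: str, raw_readings: List[float]) -> EdgeSummary:
    """Runs at the edge site: raw data never leaves the site, only the summary does."""
    return EdgeSummary(site=site, sample_count=len(raw_readings), average=mean(raw_readings))


def aggregate_in_core(summaries: List[EdgeSummary]) -> float:
    """Runs in the core: combines per-site summaries into one weighted average."""
    total = sum(s.sample_count for s in summaries)
    return sum(s.average * s.sample_count for s in summaries) / total


if __name__ == "__main__":
    # Hypothetical soil-temperature readings from two neighboring farms.
    farm_a = summarize_at_edge("farm-a", [21.0, 22.5, 20.8])
    farm_b = summarize_at_edge("farm-b", [19.5, 20.1, 20.4])
    print(f"Global average across farms: {aggregate_in_core([farm_a, farm_b]):.2f}")
```

The design choice mirrors the article's point: the latency-sensitive, data-heavy work stays at the edge, while the core only receives what is needed for cross-site comparison, keeping both the edge and the centrally located data centers in the picture.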

Author

Mattias Fridström is Vice President and Chief Evangelist of Arelion. With over 20 years in the telecommunications industry, Mattias can be considered a veteran. Since joining Telia in 1996, he has worked in a number of senior roles within Telia Carrier (now Arelion) and most recently as CTO. He has been Arelion's Chief Evangelist since July 2016.
