Edge analytics – the process of analyzing data directly at the network edge before relaying the results back to a server – is quickly changing how the internet of things (IoT) functions.
Traditionally, data gleaned from remote devices on IoT-powered "smart" networks has been sent directly to either a cloud-based or private server.
If efforts are made to automatically analyze the information received before passing it further along the technology stack (for example, by flagging urgent readings or filtering out information determined to be extraneous), this has normally been done through specialized "big data" programs operating on a central server.
Low power and high data volumes pose a unique conundrum
Almost since big data started making its mark on the IoT, however, it has been recognized that the process of indiscriminately sending data from the edge for analysis, without any kind of filtering in situ, could be improved upon from a technical perspective.
The reason why is as follows.
In the IoT – and particularly in its industrial application, the industrial internet of things (IIoT) – low power consumption is a non-negotiable device requirement.
Devices in remote locations need to operate for as long as possible on their power sources – often simple lithium batteries – to justify depending on them for network monitoring rather than alternative methods of taking readings, such as sending a crew to the site. And as every telecommunications engineer knows, the most reliable way to quickly reduce a device's power overhead is to minimize the frequency of its radio transmissions.
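The power stakes of transmission frequency can be made concrete with a back-of-the-envelope calculation. The sketch below estimates battery life as a function of how often a device reports; every figure in it (battery capacity, sleep and transmit currents, airtime per report) is an illustrative assumption, not a measurement from any particular device.

```python
# Back-of-the-envelope battery-life model for a low-power IoT device.
# All figures are illustrative assumptions, not vendor specifications.

def battery_life_days(reports_per_day,
                      battery_mah=2600.0,   # assumed single lithium cell
                      sleep_ma=0.01,        # assumed deep-sleep current
                      tx_ma=120.0,          # assumed radio current while transmitting
                      tx_seconds=5.0):      # assumed airtime per report
    """Estimate days of operation for a given reporting frequency."""
    tx_hours_per_day = reports_per_day * tx_seconds / 3600.0
    sleep_hours_per_day = 24.0 - tx_hours_per_day
    mah_per_day = tx_ma * tx_hours_per_day + sleep_ma * sleep_hours_per_day
    return battery_mah / mah_per_day

# Reporting every hour vs. only a few times a day:
hourly = battery_life_days(reports_per_day=24)
filtered = battery_life_days(reports_per_day=3)
print(f"hourly: {hourly:.0f} days, filtered: {filtered:.0f} days")
```

Under these assumed numbers, cutting reports from 24 to 3 per day extends battery life several times over, which is why transmission frequency dominates the power budget.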
The problem is that doing so risks defeating the very reason that these devices are installed in the first place.
Without relaying information on a regular basis, crucial, time-sensitive insights (such as overloading pressure in a water main) could be discovered only when it is too late – or, worse, missed entirely.
A second, emergent risk of unchecked data generation is the possibility that the information collected from IoT devices could mushroom at a faster pace than the cloud computing infrastructure required to process and store it.
IoT data would then saturate existing storage centers and expand faster than new servers could be provisioned to hold it – to the detriment of conventional internet traffic as well as device-generated information.
The solution: analyze directly on the network edge
The nascent but fast-developing area of edge analytics proposes an elegant solution to the low-power data conundrum.
The ongoing evolution of microprocessing technology means that the heavy lifting of data analysis can now be performed offline, directly onboard the devices themselves (or on a nearby IoT gateway, via local nodes), rather than on a central server in a public or private cloud environment.
Even compact monitoring kits barely larger than a cellphone, located dozens or hundreds of miles beyond the network perimeter at the true network edge, are now capable of carrying out this kind of computing. In addition, the devices are even becoming capable of utilizing sophisticated machine learning techniques to smartly interpret captured information.
For many use cases, conducting the analysis in this manner also avoids the latency concerns associated with streaming large, continuous data flows from the network edge all at once.
IoT devices can take readings and assess the measured data's importance through sophisticated algorithmic logic, all while disconnected from the internet. This allows them to take autonomous decisions about when to transmit information, and to be parsimonious in reporting it, without risking missing crucial information in the process.
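A minimal sketch of what such on-device logic might look like follows. The class name, thresholds, and decision rule (report when a reading crosses an alarm level or drifts from a rolling local baseline) are all illustrative assumptions, not any vendor's actual algorithm.

```python
# Hypothetical "report or stay silent" logic running on the device itself.
# Thresholds and the decision rule are illustrative assumptions.

from collections import deque

class EdgeFilter:
    def __init__(self, alarm_threshold, min_change, history=12):
        self.alarm_threshold = alarm_threshold  # always report above this level
        self.min_change = min_change            # report only meaningful drift
        self.readings = deque(maxlen=history)   # rolling local context

    def should_transmit(self, value):
        """Decide locally whether a reading justifies powering up the radio."""
        urgent = value >= self.alarm_threshold
        baseline = (sum(self.readings) / len(self.readings)
                    if self.readings else value)
        drifted = abs(value - baseline) >= self.min_change
        self.readings.append(value)
        return urgent or drifted

f = EdgeFilter(alarm_threshold=8.0, min_change=1.5)
for v in [4.1, 4.2, 4.0, 4.3, 9.2]:   # e.g. pressure readings in bar
    if f.should_transmit(v):
        print("transmit", v)          # only the anomalous reading goes out
```

In this toy run, four routine readings stay on the device and only the overload reading triggers a transmission – the parsimony described above, without the risk of missing the event that matters.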
The data overload problem can also be solved
The benefits of edge analysis extend far beyond the obvious power savings gained from having devices go "live" only when something of importance to network administrators is noted.
Data overload is emerging as a significant challenge for the IoT, and minimizing the information devices report is therefore regarded as an important value-add by utility network administrators.
To illustrate the scale of the data overload problem, consider the following: in 2013, IBM estimated that 90% of the data then in existence had been generated over the previous two years alone – and the total volume of data created is expected to double every two years for the next decade (Quartz).
Oft-cited industry forecasts predict that 50 billion devices will be connected to the IoT by 2020. The rate of data collection will therefore continue to outstrip humans' ability to analyze it for the foreseeable future. The key to solving this difficulty is to be more selective about what data is gathered and how much of it is sent for final interpretation by human eyes and ears.
Edge analytics can function as a vital gatekeeper in the relay between the network edge and human operators, providing the network's first layer of defense against an avalanche of irrelevant information that would otherwise make it impossible for administrators to find the needle of relevant data in the haystack of network noise.
The systems, once implemented, are also not static. Solutions currently on the market and in deployment are highly configurable, allowing administrators to carefully set and tweak parameters over time as monitoring requirements and network conditions evolve.
What’s more exciting is that cloud and edge analytics can also be deployed in parallel, to the benefit of the cloud-hosted part of the configuration: because only potentially useful information is sent onward for further parsing, cloud-based software can operate more leanly and efficiently.
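The cloud-side benefit of that parallel deployment can be illustrated with a small simulation: an edge tier forwards only readings outside an expected operating band, so the cloud stage sees a fraction of the raw stream. The operating band, the synthetic sensor noise, and the injected anomaly are all made-up assumptions for the sake of the sketch.

```python
# Illustrative simulation of edge + cloud in parallel: the edge tier
# forwards only out-of-band readings, so the cloud stage processes a
# fraction of the raw stream. All numbers here are invented.

import random

random.seed(7)
raw_stream = [random.gauss(50.0, 1.0) for _ in range(1000)]  # synthetic sensor noise
raw_stream[500] = 75.0                                       # one genuine anomaly

def edge_tier(stream, low=47.0, high=53.0):
    """Forward only readings outside the assumed normal operating band."""
    return [v for v in stream if not low <= v <= high]

forwarded = edge_tier(raw_stream)
print(f"cloud receives {len(forwarded)} of {len(raw_stream)} readings")
```

Because normal readings cluster well inside the band, only a handful of values – including the anomaly – reach the cloud, which is precisely the leaner workload described above.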
Perhaps unsurprisingly in light of the above, customer calls for edge analytics support are growing almost by the day, and industry is proving ready to respond – one estimate predicts that 40% of IoT devices will support edge analytics within three years.
Who can benefit from edge analytics
Traditionally, industrial infrastructure monitoring has resulted in the transmission of excessive amounts of data from "inside the fence" assets (such as a power generation plant) and patchy, insufficient input from its important, remote component – the network edge (such as a transformer).
Because data taken from inside the network perimeter originates centrally, where management also tends to cluster, statistics taken from such observation points have traditionally been regarded as the preserve of the "man in the tower". Otherwise put, the information has been primarily for the consumption of a utility’s executive leadership concerned with taking high-level, strategic decisions about its health.
While that situation resulted in incomplete awareness of system status, the unwieldy flood of unfiltered data from the field – absent edge analytics – has threatened to turn remote monitoring into a pyrrhic victory over its initial deficiencies.
By combining forces, edge and cloud analytics can take back remote monitoring and make it an asset for those most in need of reliable output from the network.
These are the people in the field who are directly responsible for optimizing and maintaining the network. These eyes and ears need access to edge analytics so they can take tactical decisions about the network for immediate action – not to compile data for inclusion in a monthly network efficiency report.
This use of pre-analyzed remote monitoring data to guide short-term network decisions can be termed "tactical edge analytics" and will emerge as a key driver of interest in the technology as its implementation becomes more widespread.
What challenges does remote monitoring face?
Like any emerging development in technology, of course, the sailing for edge analytics cannot be expected to be entirely smooth – at least in its initial years.
The primary challenge facing the viability of edge analytics is the constraint imposed by the very lightweight nature of IoT nodes and gateways, together with their limited query-processing power.
This is particularly true for components aboard low-power devices, whose capabilities are a fraction of what centralized supercomputers can parse.
Manufacturers are reacting quickly to address this challenge by developing dedicated hardware specifically designed for edge applications’ unique operating needs. Such hardware will be capable of performing advanced operations and is designed to remain viable for years to come.
The growth of edge analytics is a vital force in ensuring that IoT networks complement, rather than overwhelm, the sum of knowledge available to infrastructure monitoring decision-makers.
Edge analytics is also an important means of reducing the overall bandwidth footprint of edge networks, allowing them to continue to scale without outstripping networks’ ability to store their data.
As implementation gathers pace, tactical edge analytics will greatly change how data from the edge is processed by allowing those in the field to directly act upon its insights to optimize network conditions.
For that reason and more, estimates as to the growth potential of edge analytics are possibly conservative.
At the very least, those in the IoT space can expect the field to continue to drive disruptive changes in the industry for many years to come.
— Ariel Stern, CEO, Ayyeka