
To work right, IoT has to move to the cloud's edge

David Linthicum | May 13, 2016
Traditional cloud architecture has too much latency for many IoT applications. Welcome to the edge of the cloud

This vendor-written piece has been edited by Executive Networks Media to eliminate product promotion, but readers should note it will likely favour the submitter's approach.

I attended Internet of Things World in Santa Clara, Calif., this week and served as the track chairman for, you guessed it, cloud and IoT. The vibe I got throughout the event was one of confusion: IoT seems to be so systemic, yet is difficult to define. As one presenter put it, "It's like plastic. It's going to be a part of everything."

In the context of the Internet of things, the trouble with the cloud is that data needs to be sent back from the sensors gathering info, such as a Nest thermostat or a Fitbit, to a database in a remote public cloud. The time that it takes for the data to be transferred from the device or sensor to the remote public cloud -- that is, the latency -- is often too much to meet the requirements of the IoT system.

We need to do something different -- and we can start by running IoT applications at the cloud's edge. This means we don't send all data from sensors and devices back to the cloud; instead, we place data and applications at the edge of the network, where they can handle most of the data gathering and processing.

The benefit is better performance and efficiency. IoT applications need to react almost instantly to the data generated by a sensor or device, such as stopping a train when sensors report a problem with a track switch a few miles ahead or shutting down an industrial machine that is about to overheat and explode. There are hundreds of use cases where reaction time is the key value of the IoT system.
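A minimal sketch of this pattern, with hypothetical names and thresholds (EdgeNode, TEMP_LIMIT, flush_every are illustrative, not from any real product): the edge node reacts to a critical reading immediately, without a cloud round trip, and forwards only periodic summaries upstream.

```python
# Edge-side IoT processing sketch: react locally to critical readings,
# send only aggregates to the remote cloud.
from statistics import mean

TEMP_LIMIT = 90.0  # hypothetical shutdown threshold, degrees C

class EdgeNode:
    def __init__(self, flush_every=5):
        self.flush_every = flush_every  # raw readings per cloud summary
        self.buffer = []
        self.shutdowns = 0   # local reactions taken with no cloud round trip
        self.uploads = []    # summaries that would be sent to the cloud

    def handle_reading(self, temp_c):
        # React immediately at the edge -- latency is bounded by local code,
        # not by the network path to a remote data center.
        if temp_c > TEMP_LIMIT:
            self.shutdowns += 1  # e.g. cut power to the machine
        self.buffer.append(temp_c)
        # Forward an aggregate upstream instead of every raw reading.
        if len(self.buffer) >= self.flush_every:
            self.uploads.append({
                "count": len(self.buffer),
                "avg": mean(self.buffer),
                "max": max(self.buffer),
            })
            self.buffer.clear()

node = EdgeNode()
for t in [70.0, 72.0, 95.0, 71.0, 73.0]:
    node.handle_reading(t)

print(node.shutdowns)          # the 95.0 reading triggered a local shutdown
print(node.uploads[0]["max"])  # the cloud summary still records the spike
```

The design point is simply that the time-critical decision (the shutdown) happens at the edge, while the cloud still receives enough aggregated data for centralized analysis.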

Of course, we have to give this a name. Cisco Systems has tried to brand it "fog computing" and set up the OpenFog Consortium to promote its view. Whatever it ends up being called and however it's defined, the key is reducing latency for response-critical applications by moving data transfer and processing to the edge of the cloud, closer to the IoT device.

Computing at the edge of the network is, of course, not new -- we've been doing it for years to solve the same issue in other kinds of computing. I've been involved in dozens of systems where the data and applications were placed near the source yet still worked with centralized data and applications.

But in the Internet of things, the latency issue is more acute and more widespread than it is for other kinds of computing. That's why putting IoT at the edge of the cloud is such an important concept.

Source: Infoworld 

