The Foggy Future of IoT
At this point, the industry is sure about IoT—it’s here and it’s expanding. Nonetheless, given the need to solve the data volume transmission problem, the IoT future is definitely foggy.
No sooner have we gotten our minds around IoT than innovations have arrived to transform how this hyper-connected world will be implemented. Among the emerging technologies is fog computing, which moves computing closer to the end-user to decrease latency and bandwidth overload while improving access.
The fog network architecture taps client and/or edge devices to handle much of the storage, communication, control, configuration, measurement, and management demands of IoT. As such, fog computing peels away from central cloud storage, backbone networks, and network gateways (e.g., LTE).
Fog is inherently decentralized, distributing data, compute, storage, and analysis cycles wherever it is most efficient between the data source and the cloud, so less volume and more refined data can be transmitted.
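The "less volume" claim is easy to quantify with a back-of-the-envelope calculation. All of the figures below (sensor count, reading size, summary cadence) are illustrative assumptions, not measurements from any deployment:

```python
# Back-of-the-envelope: upstream traffic with and without fog aggregation.
# Every figure here is an illustrative assumption, not a measurement.
sensors = 100            # assumed fleet size
readings_per_sec = 1     # assumed sampling rate per sensor
reading_bytes = 64       # assumed size of one raw reading

# Raw path: every reading goes straight to the cloud.
raw_bps = sensors * readings_per_sec * reading_bytes

# Fog path: a local node forwards one 256-byte summary per sensor per minute.
summary_bytes = 256
summaries_per_min = 1
fog_bps = sensors * summaries_per_min * summary_bytes / 60

print(f"raw: {raw_bps:.0f} B/s")                 # 6400 B/s
print(f"fog: {fog_bps:.1f} B/s")                 # 426.7 B/s
print(f"reduction: {raw_bps / fog_bps:.0f}x")    # 15x
```

Even with these modest assumptions, the backbone sees a 15x reduction, and the cloud receives pre-refined records rather than raw samples.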
Fog computing structures that may be familiar include smart grids, smart buildings, and micro-city initiatives. Most instructive may be vehicle networks, where any latency is deadly. Software-defined networks (SDN) are another example.
The OpenFog Consortium dates back to November 2015, when members from Cisco, Microsoft, ARM, Intel, Dell, and Princeton University assembled to state a mission, develop a standard architecture, and begin communicating the business value of fog. There are also telco interests getting involved and some networks in operation.
Some Clarity on How Fog Works
So how is the whole thing put together?
Think of two endpoints. First, edge devices and sensors are located where the data is generated, but they have little or no processing power of their own. If you believe the IDC study, 10% of the world's data will be generated at the edge by 2020. That's a lot of data to be transmitted, and increasing efficiency is driving the fog computing revolution.
Second, of course, are the cloud servers, which have analytics and storage capabilities built in, but are far away from the data-generating sensors. Transmitting data between the two takes time, uses bandwidth, and can have serious privacy and security implications.
Fogging moves data processing to the smart device, router, or gateway. This cuts down the amount of data sent to the cloud and often transforms it, at least to some degree, before further transmission.
Fog computing is not a cloud replacement, but rather an add-on that provides short-term data consolidation and analytics before the cloud takes over for the resource-intensive stuff.
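The consolidate-then-forward pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API; the `FogNode` class, its method names, and the 60-reading window are all assumptions made for the example:

```python
from statistics import mean

class FogNode:
    """Illustrative sketch of a fog gateway: buffer raw sensor readings,
    summarize them locally, and forward only the compact summary upstream.
    The class and method names are hypothetical, not a standard API."""

    def __init__(self, window_size=60):
        self.window_size = window_size  # raw readings per summary
        self.buffer = []

    def ingest(self, reading):
        """Accept one raw reading; return a summary when the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            return self.flush()
        return None  # nothing to send upstream yet

    def flush(self):
        """Consolidate the buffered window into one compact record."""
        summary = {
            "count": len(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
            "mean": mean(self.buffer),
        }
        self.buffer.clear()
        return summary  # this record, not the raw window, goes to the cloud

# 60 raw readings collapse into a single four-field summary.
node = FogNode(window_size=60)
summaries = [s for r in range(60) if (s := node.ingest(float(r))) is not None]
print(summaries)
```

The cloud still does the resource-intensive work on the summaries; the fog node simply decides what is worth sending.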
Those who hear echoes of edge computing are absolutely right. The two are conceptually similar; the main difference is where the computing power is placed: edge computing runs workloads on the devices themselves, while fog computing places them one hop away, on local gateways and routers.
Pros and Cons
There are many benefits to fog computing in the emerging IoT landscape. The concept is vendor agnostic and natively cloud-oriented. Additionally, it drives:
- Reductions in data volumes sent to the cloud
- Network resource savings
- Latency decreases
- Improved data access times
- Enhanced security
As with any technology, fogging has its challenges as well:
- Fog devices are new. There simply aren’t a lot of them on the market yet, and the quality has not caught up with the power of the concept.
- Any fog device physically exists on the LAN and has to be managed. A plethora of them can make for network administration headaches.
- Security isn't wholly buttoned up. Fog computing is vulnerable to man-in-the-middle attacks and weaknesses in remote authentication schemes, not to mention the new physical security problems that come with its distributed nature.
At this point, the industry is sure about IoT—it’s here and it’s expanding. Nonetheless, given the need to solve the data volume transmission problem, the IoT future is definitely foggy.