Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have resorted to shock headlines around this subject, taking advantage of the hype around the IoT.
I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model and how companies are using IoT intelligent gateways in the fog to connect the "Things" to the cloud, through some application areas and examples of Fog Computing.
The problem with the cloud
As the Internet of Things proliferates, businesses face a growing need to analyze data from sources at the edge of a network, whether mobile phones, gateways, or IoT sensors. Cloud computing has a disadvantage here: it can't always process data quickly enough for modern, latency-sensitive business applications.
The IoT owes its explosive growth to the connection of physical things and operational technology (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make "smart" decisions without human intervention. Currently, such resources are mostly provided by cloud service providers, where the computation and storage capacity exists.
However, despite its power, the cloud model is not applicable to environments where operations are time-critical or internet connectivity is poor. This is especially true in scenarios such as telemedicine and patient care, where milliseconds can have fatal consequences. The same can be said about vehicle-to-vehicle communications, where the prevention of collisions and accidents can't afford the latency caused by the round trip to the cloud server.
“The cloud paradigm is like having your brain command your limbs from miles away — it won’t help you where you need quick reflexes.”
Moreover, having every device connected to the cloud and sending raw data over the internet can have privacy, security and legal implications, especially when dealing with sensitive data that is subject to separate regulations in different countries.
IoT nodes are closer to the action, but for the moment, they do not have the computing and storage resources to perform analytics and machine learning tasks. Cloud servers, on the other hand, have the horsepower, but are too far away to process data and respond in time.
The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.
The variety of IoT systems and the need for flexible solutions that respond to real-time events quickly make Fog Computing a compelling option.
Fog Computing: oh my God, another layer in IoT!
A study by IDC estimates that by 2020, 10 percent of the world’s data will be produced by edge devices. This will further drive the need for more efficient fog computing solutions that provide low latency and holistic intelligence simultaneously.
“Computing at the edge of the network is, of course, not new -- we've been doing it for years to solve the same issue with other kinds of computing.”
Fog Computing, or Edge Computing, is a paradigm championed by some of the biggest IoT technology players, including Cisco, IBM, and Dell. It represents a shift in architecture in which intelligence is pushed from the cloud to the edge, localizing certain kinds of analysis and decision-making.
Fog Computing enables quicker response times, unencumbered by network latency, as well as reduced traffic, since only the appropriate data is selectively relayed to the cloud.
The concept of Fog Computing attempts to transcend some of these physical limitations: with Fog Computing, processing happens on nodes physically closer to where the data is originally collected, instead of vast amounts of IoT data being sent to the cloud.
Image: Cisco's fog computing architecture. Source: http://electronicdesign.com/site-files/electronicdesign.com/files/uploads/2014/06/113191_fig4sm-cisco-fog-computing.jpg
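To make that selective relaying concrete, here is a minimal sketch of a fog node in Python. The cloud endpoint, the temperature threshold, and the sensor driver are all hypothetical stand-ins, not part of any vendor's product; a real deployment would plug in its own I/O and transport.

```python
import json
import time
import random
import urllib.request

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # hypothetical URL
TEMP_THRESHOLD = 80.0  # hypothetical alert threshold (degrees Celsius)

def read_sensor():
    """Stand-in for a real sensor driver; returns one temperature reading."""
    return random.uniform(20.0, 100.0)

def actuate_locally(reading):
    """React at the edge immediately, without waiting on a cloud round trip."""
    print(f"local action: throttling equipment at {reading:.1f} C")

def relay_to_cloud(event):
    """Forward only the events the cloud actually needs to see."""
    data = json.dumps(event).encode("utf-8")
    req = urllib.request.Request(CLOUD_ENDPOINT, data=data,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

while True:
    reading = read_sensor()
    if reading > TEMP_THRESHOLD:
        actuate_locally(reading)                              # millisecond-scale response
        relay_to_cloud({"ts": time.time(), "temp": reading})  # anomalies only
    time.sleep(1)  # routine readings never leave the node
```

The point of the pattern is the `if`: routine readings are handled and discarded at the edge, and only exceptions consume bandwidth and cloud resources.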
The OpenFog Consortium
The OpenFog Consortium was founded on the premise that open architectures and standards are essential to the success of a ubiquitous Fog Computing ecosystem.
The collaboration among tech giants such as ARM, Cisco, Dell, GE, Intel, Microsoft, and Schneider Electric to define an open, interoperable Fog Computing architecture is without a doubt good news for a vibrant supplier ecosystem.
The OpenFog Reference Architecture is an architectural evolution from traditional closed systems and the burgeoning cloud-only models to an approach that emphasizes computation nearest the edge of the network whenever dictated by business concerns or the functional requirements of critical applications.
The OpenFog Reference Architecture consists of putting micro data centers, or even small, purpose-built high-performance data analytics machines, in remote offices and locations in order to gain real-time insights from the data collected, or to promote data thinning at the edge by dramatically reducing the amount of data that needs to be transmitted to a central data center. Without having to move unnecessary data to a central data center, analytics at the edge can simplify and drastically speed up analysis while also cutting costs.
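As an illustration of that data thinning, the sketch below collapses raw readings into compact summaries before anything is transmitted. The 60-sample window and the summary fields are my own assumptions for the example, not something prescribed by the OpenFog Reference Architecture.

```python
import statistics

WINDOW_SIZE = 60  # assumption: one summary record per 60 raw samples

def thin(window):
    """Collapse a window of raw samples into a single summary record."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": statistics.mean(window),
    }

buffer = []

def on_sample(value, send_to_datacenter):
    """Called for every raw reading; only summaries cross the network."""
    buffer.append(value)
    if len(buffer) >= WINDOW_SIZE:
        send_to_datacenter(thin(buffer))  # 60 readings become 1 record
        buffer.clear()
```

The 60-to-1 reduction here is arbitrary; in practice the window size and the summary statistics are tuned to whatever fidelity the central analytics actually need.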
Benefits of Fog Computing
· Frees up network capacity - Fog computing uses far less bandwidth, which helps avoid bottlenecks. Less data moving across the network frees up capacity for other uses.
· It is truly real-time - Because data is analyzed on the spot, fog computing offers far lower latency than today's cloud architectures, making it a natural match for the real-time needs of the Internet of Things.
· Boosts data security - Collected data is more secure when it doesn't travel. Keeping data in its country of origin also simplifies storage and compliance, since sending data abroad might violate certain laws.
Disadvantages of Fog Computing
· Analytics is done locally - Fog computing lets developers access the most important IoT data from other locations, but piles of less important information remain in local storage;
· Some companies don't like their data being off their premises - with Fog Computing, lots of data is stored on the devices themselves, which are often located outside company offices, and part of the developer community perceives this as a risk.
· The whole system can sound a little confusing - a concept involving a huge number of devices around the world, each storing, analyzing, and sending its own data, can sound utterly confusing.
Read more: http://bigdata.sys-con.com/node/3809885
Examples of Fog Computing
The applications of fog computing are many, and it is powering crucial parts of IoT ecosystems, especially in industrial environments. Below are some use cases and examples.
· Thanks to the power of fog computing, New York-based renewable energy company Envision has been able to obtain a 15 percent productivity improvement from the vast network of wind turbines it operates. The company is processing as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut down data analysis time from 10 minutes to mere seconds, providing them with actionable insights and significant business benefits.
· Plat One is another firm using fog computing to improve data processing for the more than 1 million sensors it manages. The company uses the Cisco-ParStream platform to publish real-time sensor measurements for hundreds of thousands of devices, including smart lighting and parking, port and transportation management and a network of 50,000 coffee machines.
· In Palo Alto, California, a $3 million project will enable traffic lights to integrate with connected vehicles, hopefully creating a future in which people won’t be waiting in their cars at empty intersections for no reason.
· In transportation, it’s helping semi-autonomous cars assist drivers in avoiding distraction and veering off the road by providing real-time analytics and decisions on driving patterns.
· It also can help reduce the transfer of gigantic volumes of audio and video recordings generated by police dashboard and video cameras. Cameras equipped with edge computing capabilities could analyze video feeds in real time and only send relevant data to the cloud when necessary; a minimal sketch of this pattern follows below.
See more at: "Why Edge Computing Is Here to Stay: Five Use Cases" by Patrick McGarry.
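Here is that camera pattern as a hedged sketch. `detect_event` stands in for an on-camera model (motion detection, object recognition, and so on) and `upload` for whatever transport ships clips to the cloud; both are hypothetical names for this example.

```python
from collections import deque

PRE_ROLL = 30  # assumption: keep ~1 second of context at 30 frames/second

def detect_event(frame):
    """Stand-in for an on-device detector; replace with a real model."""
    return False

recent = deque(maxlen=PRE_ROLL)  # rolling buffer of the latest frames

def on_frame(frame, upload):
    recent.append(frame)
    if detect_event(frame):
        upload(list(recent))  # ship only the clip around the event
    # otherwise the frame is dropped at the edge and never crosses the network
```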
What is the future of fog computing?
The current trend shows that fog computing will continue to grow in usage and importance as the Internet of Things expands and conquers new grounds. With inexpensive, low-power processing and storage becoming more available, we can expect computation to move even closer to the edge and become ingrained in the same devices that are generating the data, creating even greater possibilities for inter-device intelligence and interactions. Sensors that only log data might one day become a thing of the past.
Janakiram MSV wondered whether Fog Computing will be the next big thing in the Internet of Things. It seems clear that while the cloud is a perfect match for the Internet of Things, there are other scenarios and IoT solutions that demand low-latency ingestion and immediate processing of data, and for those, Fog Computing is the answer.
Does the fog eliminate the cloud?
Fog computing improves efficiency and reduces the amount of data that needs to be sent to the cloud for processing. But it’s here to complement the cloud, not replace it.
The cloud will continue to have a pertinent role in the IoT cycle. In fact, with fog computing shouldering the burden of short-term analytics at the edge, cloud resources will be freed to take on the heavier tasks, especially where the analysis of historical data and large datasets is concerned. Insights obtained by the cloud can help update and tweak policies and functionality at the fog layer.
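One hedged sketch of that feedback loop: the cloud periodically recomputes a policy from historical data (here an alert threshold, using a made-up three-sigma rule) and pushes it down to the fog nodes. The class and function names are illustrative, not any particular platform's API.

```python
import statistics

def recompute_threshold(history):
    """Cloud-side: derive a fresh alert threshold from long-term telemetry."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return mean + 3 * stdev  # assumed 3-sigma policy, purely illustrative

class FogNode:
    """Fog-side: applies whatever policy the cloud last pushed down."""
    def __init__(self, threshold):
        self.threshold = threshold

    def update_policy(self, threshold):
        self.threshold = threshold  # behavior changes without redeploying code

    def is_alert(self, reading):
        return reading > self.threshold

# The cloud analyzes accumulated telemetry and tweaks the edge in one call:
node = FogNode(threshold=80.0)
node.update_policy(recompute_threshold([72.1, 74.8, 71.3, 79.5, 73.0]))
```

The division of labor is the point: the fog node stays simple and fast, while the heavy historical analysis, and the judgment about what "normal" looks like, lives in the cloud.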
And there are still many cases where the centralized, highly efficient computing infrastructure of the cloud will outperform decentralized systems in performance, scalability and costs. This includes environments where data needs to be analyzed from largely dispersed sources.
“It is the combination of fog and cloud computing that will accelerate the adoption of IoT, especially for the enterprise.”
In essence, Fog Computing allows for big data to be processed locally, or at least in closer proximity to the systems that rely on it. Newer machines could incorporate more powerful microprocessors, and interact more fluidly with other machines on the edge of the network. While fog isn’t a replacement for cloud architecture, it is a necessary step forward that will facilitate the advancement of IoT, as more industries and businesses adopt emerging technologies.
'The Cloud' is not Over
Fog computing is far from a panacea. One of the immediate costs associated with this method pertains to equipping end devices with the necessary hardware to perform calculations remotely and independently of centralized data centers. Some vendors, however, are in the process of perfecting technologies for that purpose. The tradeoff is that by investing in such solutions now, organizations will avoid having to frequently update their infrastructure and networks to cope with ever-increasing data volumes as the IoT expands.
There are certain data types and use cases that actually benefit from centralized models. Data that carries the utmost security concerns, for example, will require the secure advantages of a centralized approach or one that continues to rely solely on physical infrastructure.
Though the benefits of Fog Computing are undeniable, the Cloud has a secure future in IoT: for most companies with less time-sensitive computing needs, and for analyzing all the data gathered by IoT sensors.