Fog computing


Fog computing or fog networking, also known as fogging, is an architecture that uses edge devices to carry out a substantial amount of computation, storage, and communication locally, with the results routed over the Internet backbone.

Concept

Fog computing can be perceived both in large cloud systems and in big data structures, referring to the growing difficulty of accessing information objectively, which degrades the quality of the obtained content. The effects of fog computing on cloud computing and big data systems may vary. A common aspect, however, is a limitation in accurate content distribution, an issue that has been tackled by creating metrics that attempt to improve accuracy.
Fog networking consists of a control plane and a data plane. For example, on the data plane, fog computing enables computing services to reside at the edge of the network rather than on servers in a data center. Compared to cloud computing, fog computing emphasizes proximity to end users and client objectives, dense geographical distribution, context awareness, latency reduction, and backbone bandwidth savings, in order to achieve better quality of service and to support edge analytics and stream mining. The result is a better user experience and redundancy in case of failure; fog computing can also be used in assisted-living scenarios.
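The following minimal sketch (a hypothetical illustration, not taken from any particular fog platform) shows the data-plane idea: a fog node processes raw sensor readings locally, acts on latency-sensitive conditions without a round trip to the data center, and forwards only a compact aggregate to the cloud to save backbone bandwidth.

    # Hypothetical fog-node data-plane logic: compute locally, upload only a summary.
    from statistics import mean

    CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # placeholder URL (assumed)
    LOCAL_ALARM_THRESHOLD_C = 80.0                       # assumed local threshold

    def handle_window(temperature_readings_c):
        """Runs on the fog node for each window of raw sensor readings."""
        # Latency-sensitive decision stays at the edge: no round trip to the cloud.
        if max(temperature_readings_c) > LOCAL_ALARM_THRESHOLD_C:
            trigger_local_alarm()
        # Only a small aggregate crosses the Internet backbone.
        summary = {
            "count": len(temperature_readings_c),
            "mean_c": round(mean(temperature_readings_c), 2),
            "max_c": max(temperature_readings_c),
        }
        send_to_cloud(CLOUD_ENDPOINT, summary)

    def trigger_local_alarm():
        print("ALARM: threshold exceeded, actuating locally")

    def send_to_cloud(endpoint, payload):
        # Stand-in for an HTTP/MQTT upload, printed here to keep the sketch self-contained.
        print(f"upload to {endpoint}: {payload}")

    handle_window([72.5, 74.0, 81.2, 79.8])
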
Fog networking supports the Internet of Things (IoT) concept, in which most of the devices used by humans on a daily basis will be connected to each other. Examples include phones, wearable health monitoring devices, connected vehicles, and augmented reality devices such as Google Glass.
SPAWAR, a division of the US Navy, is prototyping and testing a scalable, secure disruption-tolerant mesh network to protect strategic military assets, both stationary and mobile. Machine-control applications running on the mesh nodes "take over" when Internet connectivity is lost. Use cases include Internet of Things applications such as smart drone swarms.
ISO/IEC 20248 provides a method whereby the data of objects identified by edge computing using Automated Identification Data Carriers (AIDC), such as a barcode and/or RFID tag, can be read, interpreted, verified, and made available in the "fog" and on the "edge", even when the AIDC tag has moved on.
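As a rough illustration only (this is not the ISO/IEC 20248 DigSig encoding itself; the key handling, payload format, and choice of the third-party Python cryptography package are assumptions), a fog node might verify a digital signature over a scanned barcode or RFID payload and then cache the verified record locally, so the data remains available at the edge after the tagged object has moved on.

    # Illustrative only: verify a signed AIDC payload at the fog node, then cache it.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Issuer side (the party encoding the data carrier) signs the record.
    issuer_key = Ed25519PrivateKey.generate()
    issuer_public_key = issuer_key.public_key()
    record = b'{"gtin": "09506000134352", "batch": "A1", "expiry": "2026-01"}'
    signature = issuer_key.sign(record)

    # Fog/edge side: verify the scanned payload and keep it available locally
    # even after the tagged object has left the read point.
    verified_cache = {}
    try:
        issuer_public_key.verify(signature, record)
        verified_cache["09506000134352"] = record
        print("record verified and cached at the edge")
    except InvalidSignature:
        print("rejected: signature does not match payload")
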

History

In 2011, the need to extend cloud computing with fog computing emerged in order to cope with the huge number of IoT devices and the big data volumes of real-time, low-latency applications.
On November 19, 2015, Cisco Systems, ARM Holdings, Dell, Intel, Microsoft, and Princeton University founded the OpenFog Consortium to promote interest and development in fog computing. Cisco senior managing director Helder Antunes became the consortium's first chairman, and Intel's chief IoT strategist Jeff Fedders became its first president.

Definition

Both cloud computing and fog computing provide storage, applications, and data to end-users. However, fog computing is closer to end-users and has wider geographical distribution.
‘Cloud computing’ is the practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer.
The term "fog computing" was defined by Prof. Jonathan Bar-Magen Numhauser in 2011 as part of his PhD dissertation project proposal. In January 2012 he presented the concept at the Third International Congress of Silenced Writings at the University of Alcalá and published it in an official source.
Also known as edge computing or fogging, fog computing facilitates the operation of compute, storage, and networking services between end devices and cloud computing data centers. While edge computing typically refers to the location where services are instantiated, fog computing implies the distribution of communication, computation, storage resources, and services on or close to devices and systems under the control of end users. Fog computing provides a medium-weight, intermediate level of computing power. Rather than a substitute, fog computing often serves as a complement to cloud computing.
In March 2018, the National Institute of Standards and Technology released a definition of fog computing, adopting much of Cisco's commercial terminology, as NIST Special Publication 500-325, Fog Computing Conceptual Model. It defines fog computing as a horizontal, physical or virtual resource paradigm that resides between smart end devices and traditional cloud computing or data centers. This paradigm supports vertically isolated, latency-sensitive applications by providing ubiquitous, scalable, layered, federated, distributed computing, storage, and network connectivity. Thus fog computing is most distinguished by its distance from the edge. In the theoretical model of fog computing, fog computing nodes are physically and functionally operative between edge nodes and the centralized cloud. Much of the terminology remains undefined, including key architectural terms such as "smart", and the distinction between fog computing and edge computing is not generally agreed upon. Fog computing can be more energy-efficient than cloud computing.
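A minimal sketch of the layered placement idea in this conceptual model (the latency and capacity figures below are assumptions for illustration): a request is served by the nearest layer, end device, fog node, or cloud, that can meet both its latency deadline and its compute demand.

    # Hypothetical placement across the device/fog/cloud hierarchy.
    LAYERS = [
        # (layer name, assumed round-trip latency in seconds, assumed compute capacity)
        ("end device", 0.005, 1),
        ("fog node",   0.020, 10),
        ("cloud",      0.150, 1000),
    ]

    def place(deadline_s, compute_units):
        """Pick the closest layer that meets the deadline and has enough capacity."""
        for name, latency_s, capacity in LAYERS:
            if latency_s <= deadline_s and compute_units <= capacity:
                return name
        return "cannot be placed"

    print(place(0.010, 1))    # latency-critical, tiny job  -> end device
    print(place(0.050, 5))    # latency-sensitive analytics -> fog node
    print(place(1.000, 500))  # heavy batch processing      -> cloud
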

Standards

The IEEE adopted the fog computing standards proposed by the OpenFog Consortium as IEEE 1934 in 2018.