The IoT, i.e. the Internet of Things, is changing the IT landscape globally. Even more than the networking of everyday objects, the so-called Smart Factory poses new challenges to traditional Cloud architectures. In Industry 4.0, the IoT is becoming a crucial technology for industrial plants. The resulting problem: a fully realized Smart Factory could generate several hundred Gigabytes of data every day. These are volumes of data that, with the technologies currently available, can neither be uploaded wirelessly to the Cloud nor processed centrally.
Fog Computing offers an approach to solving these problems with implementing the IoT. What is Fog Computing? By definition, Fog Computing is a Cloud technology in which the data generated by devices are not uploaded directly to the Cloud but are first pre-processed in decentralized mini data centers. Such a system requires a structure that extends from the outer edges of the network, where IoT devices generate data, to the final data endpoint in the Public Cloud or a private data center (Private Cloud). The "Fog Computing" marketing concept was pioneered by the US technology company Cisco, a world leader in network solutions.
The term is an apt metaphor: meteorologically, fog and clouds are both accumulations of tiny water particles; what distinguishes them is the altitude at which the phenomenon occurs. Applied to IT architectures, Fog Computing brings data processing back down to ground level through the so-called Fog nodes. These computing nodes act as intermediaries between the Cloud and the various IoT devices.
The goal of so-called "fogging" is to shorten communication paths and reduce the data flow through external networks. Fog nodes create an intermediate network layer in which some data is processed locally, while the rest is forwarded to the Cloud or a central data center for further analysis or processing. A Fog Computing architecture thus comprises three levels (layers): the edge layer with the IoT devices and sensors, the fog layer with the Fog nodes, and the cloud layer with the central data endpoint.
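A minimal sketch of how these three layers might interact, in Python with purely illustrative names (Device, FogNode, CloudBackend are assumptions, not a real product's API): devices at the edge generate raw readings, a Fog node pre-processes them locally, and only an aggregate reaches the cloud layer.

import random
import statistics


class Device:
    """Edge layer: an IoT sensor that produces raw readings."""

    def __init__(self, device_id: str):
        self.device_id = device_id

    def read(self) -> float:
        return random.gauss(20.0, 2.0)  # e.g. a temperature in deg C


class CloudBackend:
    """Cloud layer: central endpoint that receives pre-processed data."""

    def ingest(self, summary: dict) -> None:
        print(f"cloud received: {summary}")


class FogNode:
    """Fog layer: intermediary that processes data close to the devices."""

    def __init__(self, cloud: CloudBackend):
        self.cloud = cloud
        self.buffer: list[float] = []

    def collect(self, device: Device) -> None:
        self.buffer.append(device.read())

    def flush(self) -> None:
        # Local pre-processing: only an aggregate travels upstream.
        summary = {
            "count": len(self.buffer),
            "mean": round(statistics.mean(self.buffer), 2),
        }
        self.cloud.ingest(summary)
        self.buffer.clear()


fog = FogNode(CloudBackend())
sensors = [Device(f"sensor-{i}") for i in range(3)]
for _ in range(10):
    for s in sensors:
        fog.collect(s)
fog.flush()  # one aggregated message instead of 30 raw uploads

The point of the sketch is the flush step: thirty raw readings never leave the local network; the Cloud sees a single summary.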
Fog Computing differs from Cloud Computing, where resources are provided and data is processed centrally. Cloud Computing typically takes place in centralized data centers: resources such as computing power and memory are pooled and made available by backend servers to clients in the network.
Communication between different terminals always takes place via a backend server. This kind of architecture reaches its limits in scenarios such as the Smart Factory, in which there is a continuous exchange of data between countless end devices. Fog Computing relies instead on intermediate processing close to the data source to reduce the data flow to the data center.
It is not only the data volume of large IoT architectures that pushes Cloud Computing to its limits. Latency poses a similar problem. Centralized data processing regularly results in delays due to long transmission paths: terminals and sensors have to contact the data center server to communicate, and must then wait for both the external processing of the request and the response.
Latencies of this kind are a problem, for example, in automated manufacturing processes, where real-time data processing is critical to react immediately to any incidents. One approach that aims to remedy latency is edge computing. This idea derives from Fog Computing, but data processing is decentralized even further and carried out directly on the end device and, therefore, at the network's edge. Every smart device is equipped with a microcontroller that allows primary data processing and communication with other devices and sensors.
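As a contrast to the Fog-node sketch above, here is what primary processing on the device itself could look like. This is a hedged illustration only: the transmit() stub and the 0.5-degree threshold are assumptions, standing in for whatever network call and tolerance a real microcontroller firmware would use.

def transmit(event: dict) -> None:
    print(f"sending upstream: {event}")  # stand-in for a network call


def run_on_device(readings: list[float], threshold: float = 0.5) -> None:
    """Primary processing on the microcontroller: suppress readings
    that barely differ from the last transmitted value."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) >= threshold:
            transmit({"value": value})
            last_sent = value


run_on_device([20.0, 20.1, 20.2, 21.5, 21.6, 19.0])
# Only 20.0, 21.5 and 19.0 are transmitted; the rest stay on-device.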
The IoT of today cannot be compared to that of tomorrow. According to a Cisco forecast, by 2020 the Internet of Things would encompass about 50 billion devices, and the volume of data to be stored, analyzed and prepared for further processing would be correspondingly high. The Smart Factory is not the only field of application for Fog Computing. Other future projects, such as connected cars (semi-autonomous or driverless vehicles) or Smart Cities, i.e. cities with intelligent power grids, will also require real-time data analysis, which classic Cloud Computing cannot deliver.
For example, an intelligent vehicle will be able to collect data on its surroundings, driving conditions and traffic, which will need to be evaluated without latency so that it can react promptly to unpredictable events. In such a scenario, Fog Computing allows vehicle data to be processed both within the vehicle itself and on the service provider's side.
Fog Computing offers solution approaches for various problems in Cloud-based IT infrastructures. Fog systems shorten communication paths and minimize uploads to the Cloud. However, decentralized computing at the edge of the network also has disadvantages, stemming mainly from the maintenance and administration costs of a distributed system.
By 2020, the new 5G standard for mobile communications, with download speeds of up to 10 Gbit/s, will be introduced across Europe. Experts predict massive growth in data volumes, especially in the professional sector. With 5G, the bandwidth and speed of mobile data transmission will increase enormously and open up entirely new application possibilities in industry and the service sector. 5G promises users latency below one millisecond.
With current Cloud technologies, however, the advantages of the new mobile communication standard could not be exploited effectively, for example because of the number of hops a data packet requires from its source to its destination in the Cloud. The direct upload of data to the Cloud is therefore not suitable for applications that require real-time data processing. Thanks to the approach proposed by Fog Computing, 5G becomes usable in the industrial sector.
A distributed Fog system provides computing power and storage capacity at the network's edge. In this way, the data generated by industrial applications can be evaluated locally, then selected and aggregated for the Cloud, ensuring that time-critical results, such as the emergency shutdown command for a production line, are delivered in real time. Only data that is difficult or impossible to evaluate locally, or that requires more detailed analysis, such as unexpected measured values indicating that a machine needs maintenance, is sent to the Cloud.
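A hedged sketch of the behaviour just described: the Fog node evaluates measurements locally, triggers the emergency shutdown in real time without a Cloud round trip, and forwards only unexpected values upstream for deeper analysis. The thresholds and function names are illustrative assumptions, not a specific vendor's API.

CRITICAL_TEMP = 90.0   # immediate local shutdown above this (assumed value)
EXPECTED_MAX = 75.0    # above this, forward to the Cloud for analysis


def emergency_shutdown(line_id: str) -> None:
    # Real-time local reaction; never waits for a Cloud response.
    print(f"SHUTDOWN {line_id}")


def send_to_cloud(record: dict) -> None:
    print(f"forwarding to cloud: {record}")  # stand-in for an upload


def evaluate(line_id: str, temperature: float) -> None:
    if temperature > CRITICAL_TEMP:
        emergency_shutdown(line_id)  # handled entirely at the edge
    elif temperature > EXPECTED_MAX:
        # Unexpected but not critical: may indicate maintenance is due,
        # so the detailed analysis happens centrally.
        send_to_cloud({"line": line_id, "temp": temperature})
    # Normal values stay local and are only uploaded in aggregated form.


for t in (70.2, 78.4, 93.1):
    evaluate("press-7", t)

The design choice mirrors the paragraph above: the latency-sensitive path (shutdown) never leaves the edge, while the analysis-heavy path (maintenance indicators) is the only traffic that crosses the external network.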