Not every application is best run centrally in the cloud. Processing data directly where it is generated is often more suitable, especially when time is of the essence and the results need to be distributed quickly. Such local solutions are grouped under the umbrella term fog computing.
The most frequently cited example of fog computing in action is the self-driving vehicle: to steer through traffic on its own, it must constantly gather data about its surroundings and the neighbouring cars; at the same time it should ideally pass on its own position and warnings to other road users. If, for example, it is forced to brake hard because of a reckless lane change, that information should be relayed via car-to-car communication to the other vehicles in the area to prevent a rear-end collision.
Making contacts in the Internet of Things
In this scenario it makes little sense to first send the current speed data to a distant central cloud data centre, which then analyses it and, back down the same path, alerts the trailing cars to the braking vehicle ahead. After all, remote country roads and rural routes in particular often lack an Internet connection. Moreover, in such situations the difference between life and death is often a matter of tenths of a second: by the time the data has been sent to the cloud, processed and the results returned, it may already be too late.
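How such a local decision might be structured can be outlined in a few lines. The following sketch, written in Python purely for illustration, assumes hypothetical on-board components (a car-to-car radio object and an optional cloud uplink); it is not any vendor's actual interface, but it shows the basic split: the time-critical warning goes straight to nearby cars, while non-critical telemetry is handed to the cloud only if a connection happens to exist.

```python
# Illustrative sketch only: how a vehicle's on-board (fog) node might handle a
# hard-braking event locally instead of routing it through a cloud data centre.
# All names (OnBoardFogNode, BrakeEvent, the radio and uplink objects) are
# hypothetical assumptions, not a real automotive API.

from dataclasses import dataclass
import time

HARD_BRAKE_THRESHOLD = -6.0  # deceleration in m/s^2; assumed example value


@dataclass
class BrakeEvent:
    vehicle_id: str
    position: tuple      # (latitude, longitude)
    deceleration: float  # m/s^2, negative values mean braking
    timestamp: float


class OnBoardFogNode:
    """Hypothetical on-board unit: reacts locally, uses the cloud only as a bonus."""

    def __init__(self, v2v_radio, cloud_uplink=None):
        self.v2v = v2v_radio        # direct car-to-car link
        self.cloud = cloud_uplink   # wide-area uplink that may well be unavailable

    def on_sensor_reading(self, vehicle_id, position, deceleration):
        event = BrakeEvent(vehicle_id, position, deceleration, time.time())

        if deceleration <= HARD_BRAKE_THRESHOLD:
            # Time-critical path: warn trailing cars directly, no cloud round trip.
            self.v2v.broadcast({"type": "hard_brake_warning", **vars(event)})

        if self.cloud is not None and self.cloud.is_connected():
            # Non-critical path: telemetry goes to the cloud only if a link exists.
            self.cloud.send_async(vars(event))
```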
These are the kinds of considerations that gave rise to the concept of fog computing, sometimes also referred to as edge computing. It is aimed above all at the Internet of Things (IoT), in which billions of mostly small devices and sensors will in future exchange data with one another. Many of them have little or no processing power of their own, which is why the idea until now has been to transmit their data to powerful cloud servers and let them do the processing.
More computing power, less power consumption
But the picture has since changed. The latest generations of mobile processors offer quad-core performance, delivering substantial computing power in a small, energy-efficient package. For applications in the automotive sector, such as the scenario described above, that is ideal.
Providers of route-planning software for mobile devices are already exploiting this. Whereas such programs previously tended to hand the calculation of the best route from A to B off to the cloud, this task is now performed locally on the device. The applications can therefore be used even when no network is available, and because less data has to be transferred, costs fall as well, especially when roaming abroad.
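The underlying pattern is simple: calculate on the device and treat the network as optional. The sketch below illustrates it with invented local_router and traffic_service stand-ins (Python, for illustration only); no specific navigation product's API is implied.

```python
def plan_route(local_router, traffic_service, origin, destination):
    """Offline-first routing: the device computes the route, the network is optional.

    local_router and traffic_service are hypothetical stand-ins, not a real API.
    """
    # The base route is always calculated locally, so navigation keeps working
    # without any connection and no route data has to be transferred.
    route = local_router.shortest_path(origin, destination)

    # Live traffic information is treated as an optional refinement, fetched
    # only when a network connection happens to be available.
    if traffic_service is not None and traffic_service.is_reachable():
        delays = traffic_service.current_delays(route)
        route = local_router.reroute_around(route, delays)

    return route
```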
First the router, then data points
The term ‘fog computing’ comes from Cisco. With it, the network equipment vendor vividly alludes to the difference between the distant cloud high in the sky and its own concept, which stays closer to the ground: a local patch of fog near where the work has to be done. Cisco wants to extend its routers so that they can perform most of the calculations arising in the IoT themselves and contact the cloud only when absolutely necessary. In terms of the example above, that would be the case when the connected vehicle is involved in an accident and has to notify the rescue services.
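Reduced to its core, the router's role in this model is that of a filter: handle routine messages at the edge and escalate only the exceptional ones. The following sketch, again with invented names and written in Python purely for illustration, shows one way such a rule could look; it is not Cisco's implementation.

```python
# Which event types must leave the fog and reach the cloud; everything else stays local.
# The event name and the rule itself are assumed examples.
ESCALATE_TO_CLOUD = {"accident_detected"}  # e.g. to alert rescue services


def handle_iot_message(message, local_store, cloud_client):
    """Hypothetical fog-router rule: process locally by default, escalate only if needed."""
    if message.get("type") in ESCALATE_TO_CLOUD:
        # The rare, critical case: the outside world has to be informed.
        cloud_client.notify(message)
    else:
        # The common case: keep, aggregate and act on the data at the edge,
        # which keeps latency low and WAN traffic to a minimum.
        local_store.append(message)
```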
Today, roughly a year and a half after the term first appeared, fog computing has nevertheless already moved beyond the router and established itself as an interesting alternative to the centralised cloud model. (rf)