What does the connected future have in store?
By 2025, 150 billion objects will be connected worldwide, forming the Internet of Things (IoT) (1). Most of these objects will produce data in real time, representing 30% of the datasphere. And this escalation won't be driven by machines alone. Every single one of us will be connected, producing on average 4,900 times the current amount of data every day, i.e. one digital interaction every 18 seconds!
Whether in industrial environments, with the revolution of Industry 4.0, in the spaces that host us (smart cities and smart buildings), in our cars, which will be connected before they become autonomous, or in our daily personal lives with our smartphones and e-health, the IoT and connected objects will be ubiquitous. The result will be an explosion of data volumes in environments that are currently far from able to support them.
We’re already wondering how our information systems (IS) will cope with this data overload. Our legacy of centralised IT, whether located physically within companies, hosted by partners or in the public cloud, is not suited to handling and processing connected data. Furthermore, the infrastructures that support, transport and process IoT data are already struggling with bottlenecks and latency issues. Latency, however, is incompatible with the other phenomenon associated with the IoT: real time.
But what is Edge Computing?
In its study, IDC uses an interesting representation of the spread of data, from 'endpoints' (users, cell phones, computers, APIs, IoT devices, vehicles, etc.) to the 'core' (heart) of corporate IT (data centres, private cloud and public cloud), and vice versa. Between these two extreme layers is an intermediate one, called the Edge, which is located on the outskirts of the IT core or its uses. In the world of the IoT, which is directly concerned, the core is the datacentre or cloud infrastructure made available to store and analyse the data, and the endpoints are sensors and connected, 'intelligent' things. Between the two are the communication tools, the network infrastructure and the Internet, which transport the data. The core sits at a specific point on the planet, while the endpoints are everywhere. This explains the latencies and the data transport and processing times!
With Edge Computing, data storage and processing capacities are placed in this intermediate layer. In concrete terms, this means installing or using proximity relays, such as the clouds of local operators or micro-datacentres, with the capacity to store and analyse the data transmitted. In this way, part of the functions offered by the 'core' is delocalised to the edge of the IoT. These relays can, for example, analyse the data in transit and discard the mass of messages that carry no value (routine messages indicating normal sensor operation, as opposed to the rare incident, error or overload messages), transmitting only valuable data to the core for processing, which significantly reduces bandwidth consumption. Alternatively, analytics or AI (Artificial Intelligence) processing can be applied locally, to produce information in near real time.
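This edge-side filtering can be sketched in a few lines. The minimal example below is purely illustrative, assuming a hypothetical message format in which each sensor reading carries a "type" field: routine operational heartbeats are dropped at the edge, and only the rare incident, error or overload messages are forwarded to the core.

```python
# Illustrative sketch of edge-side filtering (hypothetical message format):
# routine "ok" heartbeats are dropped; only messages with value are
# forwarded to the core, shrinking the bandwidth used on the uplink.

FORWARD_TYPES = {"incident", "error", "overload"}

def filter_at_edge(messages):
    """Return only the messages worth sending to the core."""
    return [m for m in messages if m.get("type") in FORWARD_TYPES]

# A hypothetical stream of readings arriving at an edge relay
stream = [
    {"sensor": "pump-01", "type": "ok"},
    {"sensor": "pump-01", "type": "ok"},
    {"sensor": "pump-02", "type": "overload", "value": 97.5},
    {"sensor": "pump-01", "type": "ok"},
    {"sensor": "pump-03", "type": "error", "code": 12},
]

to_core = filter_at_edge(stream)
print(len(stream), "messages received,", len(to_core), "forwarded to the core")
```

Here five raw messages arrive but only two leave the edge; at IoT scale, the same principle is what keeps the core and the network from drowning in routine telemetry.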