From Fairchild to 5G: the history of Edge Computing

An in-depth exploration of the history of edge computing, from its monolithic origins in the 1960s through to its use with IoT devices today.


The simplest definition of Edge Computing is that it brings both storage and compute resources nearer to where data is generated. At first glance, this description belies the magnitude of what the technology really means for how we gather, process, store and use data. Data holds the key to digital transformation, as well as to the success of 5G and many of the other technologies featured in tech trends lists for 2021 and beyond.

To extract the immense value held in data, it has to be analysed, stored and used effectively, but this becomes a greater challenge each year: even as data grows more important, its sheer volume makes it harder to harness and use. This is where Edge has the potential to unlock digital transformation for organisations across the globe.

Traditionally, IT infrastructure requires data to be transmitted over large distances – journeys that often mean traversing numerous network connections along the way. This not only requires a lot of hardware, but a lot of expensive bandwidth as well, presenting major cost and efficiency challenges for small and medium-sized businesses in particular. By bringing storage and compute power to the source of the data, Edge removes the need for much of that hardware and bandwidth, while giving users on the ground real-time data insights with greater reliability and availability.
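
To make the bandwidth saving concrete, here is a minimal sketch in Python of the pattern in miniature: an edge device aggregates raw sensor readings locally and transmits only a compact summary upstream, rather than streaming every sample to a distant datacentre. The sensor feed is simulated and CLOUD_ENDPOINT is a placeholder URL – both are assumptions for illustration, not part of any particular edge platform.

```python
import json
import random
import statistics
import urllib.request

# Hypothetical upstream ingestion endpoint; in a real deployment this
# would be a cloud API, not this placeholder URL.
CLOUD_ENDPOINT = "https://example.com/ingest"

def read_sensor_window(samples=60):
    """Stand-in for reading a window of raw values from a local sensor."""
    return [20.0 + random.gauss(0, 0.5) for _ in range(samples)]

def summarise(readings):
    """Reduce raw readings to a compact summary on the edge device."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 3),
        "min": round(min(readings), 3),
        "max": round(max(readings), 3),
    }

def push_summary(summary):
    """Send only the summary upstream, instead of every raw reading."""
    payload = json.dumps(summary).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    raw = read_sensor_window()   # raw samples gathered locally
    summary = summarise(raw)     # computed at the edge
    print(f"Sending {len(json.dumps(summary))} bytes upstream instead of "
          f"{len(json.dumps(raw))} bytes of raw data")
    # push_summary(summary)      # uncomment once a real endpoint exists
```

Even in this toy example, the summary is an order of magnitude smaller than the raw window; at the scale of thousands of devices, that difference is what the bandwidth argument above rests on.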

With the scene set, we can now trace the foundational building blocks in the technology's history through to the present day, delve into the details of how it works, and conclude with a spotlight on enterprise applications and the future.

Monolithic origins

When organisations first began adopting mainframe computer systems, the machines were vast and extremely expensive. The large organisations capable of spending hundreds of thousands of dollars on one had to dedicate entire rooms to housing it, with all network computing carried out via a physical datacentre. These systems required staff to carry out manual data entry, processing power was low, and customisation was very limited.
