Network Management

Edge computing 101: A CIO demystification guide

Every so often in the IT industry, a new buzzword crops up and everyone rushes to jump on the bandwagon, even if the buzzword turns out to be little more than a new twist on ideas and practices that have been in use for decades.

The latest example of this phenomenon is edge computing, touted by some commentators as the Next Big Thing. Debate has raged over whether it will replace cloud computing and whether it represents a new multi-billion-dollar market opportunity along the lines of the internet of things (IoT).

A straightforward explanation of edge computing is that it involves doing some processing at the edge of the network, rather than consolidating all processing power at its core. As with other buzzwords, an exact definition may be elusive, but a good starting point is to think of it as putting processing power at the point of action, or alternatively, in the most appropriate place for the specific application.
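The idea of putting processing power at the point of action can be illustrated with a minimal, hypothetical sketch: an edge device aggregates raw sensor readings locally and forwards only a compact summary towards the core, rather than shipping every data point across the network. The function name and data here are illustrative assumptions, not any vendor's real API.

```python
# Hypothetical edge-gateway sketch: raw readings are processed locally
# and only a small summary (plus any alerts) would cross the network.
from statistics import mean

def summarise(readings, threshold=75.0):
    """Reduce a batch of raw sensor readings to the fields worth uploading."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        # Anomalies are flagged at the edge, so the core only sees alerts.
        "alerts": [r for r in readings if r > threshold],
    }

# The raw batch stays on the edge device; only the summary is sent on.
batch = [70.1, 71.4, 69.8, 82.5, 70.0]
payload = summarise(batch)
print(payload)
```

The point of the sketch is the shape of the traffic: five raw readings in, one small dictionary out, which is the bandwidth and latency saving that edge advocates point to.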


“I would say that edge computing has probably two, three or four different definitions, depending upon whom you are talking to,” says Tony Lock, distinguished analyst at Freeform Dynamics.



Dan Robinson

Dan Robinson has over 20 years of experience as an IT journalist, covering everything from smartphones to IBM mainframes and supercomputers as well as the Windows PC industry. Based in the UK, Dan has a background in electronics and a BSc Hons in Information Technology.


