Moore's Law

Are you really using all that CPU power?

This article sprang from some experiments with old technology, which is something I play with in my spare time to avoid having to take up golf or fishing, or buy a mid-life crisis sports car.

I have a 286 PC behind me running DOS 3.30 and Windows 2.03 from 1988. Windows was considered bloatware by many people at the time, compared with the lean efficiency of DOS. It came with Write, a basic word-processing tool, and various other utilities. Its entire footprint is 1.6MB of disk space, and it runs in less than half that amount of RAM. On this old machine it boots to a working desktop in 25 seconds.

I installed it just for amusement value (I do realise I have issues), then discovered I could create basic business graphics, cut and paste them into documents in Write, add text formatting such as bold, underline, italic, indents, etc., and then print the finished document. I don't really need much more than that to do my job (with one major exception that I'll come to later), though a spell-checker would be nice.

So... 27 years on, what exactly have I gained? Yes, I know: lots of stuff, lots of features. Most of it useless to me and to 98.5% of the intended user base. For everyone but that 1.5%, it's merely wasted resources.

Or is it? That was my first conclusion but now I'm not so sure. The editor at IDG Connect also made some valid counter-arguments. So in this article I'll attempt to present both sides of the debate, and then we'd like to hear from you about your experiences and your use – or lack of use – of today's vast computing power.

Note that I'm limiting this to a discussion about desktop, tablet and laptop power. There are plenty of valid uses for today's computing power in server farms and for crunching business data.

A waste of resources

Imagine getting into a time machine in 1990 and fast-forwarding 25 years to the fabulous future of desktop computing. You'd emerge, blinking, into the strip-lighting of the brave new office world, and see... a lot of people sitting at desks in front of computers. The screens would be thinner and wider, the actual computing boxes would be smaller, and the laptops would be svelte beyond belief.

But the keyboards would be the same, the mice would be the same, and there would still be cables everywhere. Hmm.

Now think of the power. Think of Moore's law and the related prediction that computing performance doubles every 18 months. Twenty-five years is more than 16 doublings, which works out to roughly 65,000 times more processing grunt than you had in 1990.
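That back-of-the-envelope figure is easy to check, for instance in a few lines of Python (the 18-month doubling period is the prediction quoted above, not a law of nature):

```python
# 25 years of doubling every 18 months.
months = 25 * 12
doublings = months / 18          # about 16.7 doublings

# Take a conservative 16 full doublings.
factor = 2 ** int(doublings)

print(round(doublings, 1))       # 16.7
print(factor)                    # 65536 -- "roughly 65,000 times"
```

Taking the full 16.7 doublings would give a factor above 100,000, so "roughly 65,000" is, if anything, on the cautious side.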

What brave new software would be in use on these vastly powerful machines? Surely it would be like nothing you'd ever seen before, something amazing and holographic and truly AI. You ask the nearest worker what software they're using. "This? Oh, it's Microsoft Word running on Windows."

Word-processing is much easier in a graphical environment than in DOS, which is what made Word for Windows and its WYSIWYG competitors so successful a quarter of a century ago. But for the average user, not much has really changed since then. We write, edit, save, share, print. The difference now is that instead of using 80% of the program's features, we use about 0.08%.

It's the same story with spreadsheets. Try something like As Easy As from the previous millennium, which fits as much power into 500KB as some versions of Excel squeezed into half a CD-ROM.

What about full-screen video? A colleague had that on a 386 PC in 1995, using a plug-in MPEG card. Immersive computer gaming? Try Outcast from the late 1990s. Sure, the graphics are better now, but not as much better as they should be. And the quality of the gameplay has barely shifted: it's just moved online.

For greater contrast, consider the tools that pre-dated Windows. Use something like VDE in DOS and you'll realise that a good assembly language programmer working alone can create something more useful and feature-rich in 70KB than a gaggle of high-level developers can manage together in 10,000 times that space.

The exception to all this is the internet, which wasn't around in a recognisable form in 1990. But people still shared files and chatted with each other around the world. Even here, it's not like all that computing power is actually essential. If you want an extreme example, you can even run a web server on a Commodore 64 (though it's not live at the moment) and browse using an IBM XT.

As I sit here writing this feature on a nine-year-old laptop, with a browser, mail client and numerous other programs open in the background, its CPU utilisation is flickering around 1%. All this power is simply wasted on us.
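If you're curious what your own machine is doing, here's a quick sketch for Unix-like systems. It uses the one-minute load average as a rough proxy for CPU utilisation (it isn't an instantaneous reading, but it's enough to show how little of the hardware a typical desktop workload touches):

```python
import os

# 1-minute load average: roughly, the number of cores' worth of
# work the system has been asking for. Unix-like systems only.
load_1min, _, _ = os.getloadavg()
cores = os.cpu_count()

# Dividing by core count gives an approximate overall utilisation,
# where 100% means every core fully busy.
utilisation_pct = 100.0 * load_1min / cores

print(f"~{utilisation_pct:.1f}% of {cores} cores in use")
```

On an idle desktop this typically prints a low single-digit percentage, much like the 1% figure above.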

Using every CPU cycle to the max

Most office workers don't really know what goes on beneath their slick operating systems. Optimisations are being made all the time; the software looks ahead to predict and prepare for your next actions; dozens of threats are mitigated every second; programmers have built mountains of code designed to meet every user's requirements; and it looks pretty too.

In fact the IT industry is a big advertisement for unfettered capitalism, as long as you don't look too closely inside some of the factories. Decades of innovation have gone largely unchecked by politicians, since most of them don't understand it, so we've reaped huge rewards.

Clunky old desktop slabs from 1990 and so-called laptops weighing more than a large brick have been consigned to history. Now we have sleek, efficient, quiet (remember the noise of the cooling fans?) and above all supremely capable machines that can do just about whatever we want.

It's not underutilisation that's giving low CPU readings: it's efficient design. Software and hardware throttle the CPU when it's not needed, keeping power consumption to a minimum. That's yet another way in which today's machines are so much better than those of 25 years ago.

Gamers have driven much of this progress, by demanding ever more realism in their play, but we've all benefited. The high-end hardware is still expensive, but a good quality multi-purpose computer can be had today for a few days' salary. It'll do everything you need it to do and much more, now and for several years to come.

Video-conferencing, fast and simple transmission of ideas and information, streaming content, massively interconnected systems: all of this benefits us in ways that go beyond mere CPU cycles.

Maybe today's machines aren't 65,000 times better than those of 1990. But the difference is still vast, and we make use of that power in ways that we couldn't even have thought of back then.

Over to you

What do you think? How does your business use all the processing power at its disposal? How do you use your laptops and tablets at home? Have the gains been translated into better productivity or are you just doing the same old things slightly faster?

Get in touch below and let us know what you think.

Alex Cruickshank

Alex Cruickshank has been writing about technology and business since 1994. He has lived in various far-flung places around the world and is now based in Berlin.  

