
Wanted: a new era of low-heat computing

Here are two interesting stats:

First, according to NASA and the NOAA, 2014 was the warmest year for planet Earth since records began.

Second, according to Gartner, the ICT industry accounts for two per cent of global CO2 emissions – though that estimate dates from 2007, so the figure is probably over three per cent by now.

That percentage of global CO2 is roughly the same as is generated by the aviation industry. If all of it were a by-product of raw computing power – if every molecule of CO2 produced represented 100 per cent efficient computing – there might not be much of a problem. After all, the global economy depends on computers. It's hard to think of a single industry that hasn't incorporated ICT at some point in its supply chain.

Unfortunately, ICT is not energy-efficient. Far from it. As any server administrator will know, computers produce two things: calculations and heat. Sometimes it seems as though they produce more of the latter than the former. In a conventional server farm the cooling apparatus can use as much electricity as the servers themselves.
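
To put a rough number on that, datacentre operators talk about Power Usage Effectiveness, or PUE: total facility power divided by the power that actually reaches the IT equipment. The sketch below uses purely illustrative figures, not measurements, but it shows how a cooling load equal to the server load pushes PUE to about 2.0 – meaning only half of the electricity bill does any computing at all.

```python
# Illustrative only: Power Usage Effectiveness (PUE) for a hypothetical
# server farm where cooling draws as much power as the servers themselves.

def pue(it_power_kw: float, overhead_power_kw: float) -> float:
    """PUE = total facility power / power delivered to IT equipment."""
    return (it_power_kw + overhead_power_kw) / it_power_kw

servers_kw = 500.0   # assumed IT load
cooling_kw = 500.0   # cooling load equal to the IT load, as described above

print(f"PUE: {pue(servers_kw, cooling_kw):.2f}")  # -> 2.00
print(f"Share of power doing computation: {servers_kw / (servers_kw + cooling_kw):.0%}")  # -> 50%
```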

Any student of physics knows that heat is the default state of energy. As entropy increases, all other types of energy – kinetic, electrical, gravitational, et cetera – degrade into heat. It takes a lot of work to convert heat into other energy states efficiently, but very little work to convert other states of energy into heat. Eventually, many billions of years from now (hopefully), all that will be left in the universe will be a gentle, toasty background glow of heat.

The World Economic Forum, which has previously pondered topics such as the inefficiency of incandescent light bulbs, recently debated the heating effect of computation, though without firm conclusions or recommendations. ICT is generally more energy efficient than such bulbs, which can expend more than 90 per cent of the energy they consume as heat.

But the difference isn't great. Your smartphone might be only 30 per cent efficient, for example, which is one reason why you have to charge it so often. Indeed, once the network and datacentre energy behind its data usage is counted, your iPhone could be using more energy than your fridge. And as for that PC on your desk, it wastes so much thermal energy that you could use it to heat your home in winter, or at least one room.
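
The arithmetic behind that fridge comparison is worth a back-of-the-envelope sketch. The figures below are assumptions rather than measurements (a roughly 10 Wh phone battery charged daily, a fridge drawing around 400 kWh a year), but they show why the comparison only makes sense once the upstream network and datacentre energy is included.

```python
# Back-of-the-envelope comparison, using assumed (not measured) figures.
battery_wh = 10.0          # assumed smartphone battery capacity in watt-hours
charges_per_year = 365     # one full charge a day
charger_efficiency = 0.7   # assumed wall-to-battery charging efficiency

charging_kwh_per_year = battery_wh * charges_per_year / charger_efficiency / 1000
fridge_kwh_per_year = 400.0   # assumed figure for a typical domestic fridge

print(f"Phone charging: ~{charging_kwh_per_year:.0f} kWh/year")  # ~5 kWh
print(f"Fridge:         ~{fridge_kwh_per_year:.0f} kWh/year")
# The gap between the two is what the 'phone vs fridge' claim attributes to
# the networks and datacentres serving the phone's data traffic.
```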

Engineers spend much of their time fighting entropy and unwanted heat generation, especially in computing. Shuttling electrical signals around tiny processors and other chips at near light speed requires a lot of energy and generates a lot of heat as an unwanted by-product. It's an ever-increasing problem and it places an upper limit on computing performance. But in some parts of the world that problem is being turned into a solution to other problems.

For example, some companies in Germany are being offered the chance to heat their offices using the thermal by-product of cloud servers, while Amazon is planning to heat its own buildings in a similar way. Other companies are using heat generated by their server farms to provide warmth to local residents' homes in winter. Facebook and some other large companies are taking a different approach, and simply building new datacentres in cold countries to save on cooling costs.

Still, that heat has to go somewhere. Heating homes in winter is great, but what happens in summer? Those homes will still need hot water, of course, but not as much. You could convert the excess heat back into electricity, but the temperatures involved aren't high enough for efficient conversion, so we're back to entropy again. Even burying a server farm in Arctic tundra has its effects. It may save on the air-conditioning bill, but the heat generated will raise the temperature of the surrounding environment, with unknown consequences.
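
The thermodynamics behind that "not high enough" claim is easy to sketch. The Carnot limit – the best efficiency any heat engine can achieve – is one minus the ratio of the cold and hot temperatures in kelvin, and for the lukewarm air coming off a server rack it is dispiritingly small. The temperatures below are illustrative assumptions, and real machines fall well short of even this limit.

```python
# Carnot limit for turning low-grade server-exhaust heat back into electricity.
# Temperatures are illustrative assumptions, not measurements.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum theoretical efficiency of a heat engine between two temperatures."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

exhaust_c = 45.0   # assumed server exhaust-air temperature
ambient_c = 20.0   # assumed ambient temperature

print(f"Carnot limit: {carnot_efficiency(exhaust_c, ambient_c):.1%}")  # ~7.9%
```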

However you view the concept of anthropogenic climate change, the waste heat thrown out by ICT is a problem. It's unwanted, it's expensive to manage and it slows down calculations. The solutions mentioned here are imaginative and useful, but they are stop-gaps only, though some companies have ambitious plans to take things further.

What's really needed is a new form of computing that's far more energy efficient. We need the equivalent of an LED light bulb for ICT, something that boosts efficiency by an order of magnitude at a single stroke.

There doesn't seem to be anything on the immediate horizon that fits the bill, but that doesn't mean it won't happen. White, daylight-quality LED bulbs only became viable decades after the all-important blue LED was invented. Perhaps the solution is already out there, but not quite ready for widespread use.

A new era of low-heat computing? I wouldn't bet against it. The rewards for whoever invents such a cool technology would be huge.

Alex Cruickshank

Alex Cruickshank has been writing about technology and business since 1994. He has lived in various far-flung places around the world and is now based in Berlin.  
