Green Business

Roger Schmidt (Global) - Utilization Is the Name of the Game for Energy Efficiency

Information Technology may seem inherently green and energy efficient - electrons and photons zipping through semiconductors and optical fibers do most of the work, there's nary a polluted waste-stream nor belching smokestack in sight. But computers and storage require electricity to operate, and because there are so many datacenters out there, they require quite a lot of electricity overall - about 1% of the world's total demand.

In the US, datacenters consume more electricity than is used in all metal products fabrication. And computing is growing rapidly, so computer engineers and datacenter operators need to focus more of their attention on reducing power consumption and using the power more efficiently. It's good for the planet, and it's good for their bottom lines.

So far, there's good news and bad news. The good news is that engineers have made astounding leaps forward in making chips more energy efficient. Datacenters are getting much bigger, which tends to increase the opportunities to make them run more efficiently. The bad news is that most computers and disk drives keep running even when they aren't doing any work. Engineers and software designers have a long way to go in making every cycle produce work. Even in well-run datacenters, many computers run at less than 15% of their full capacity.

Think of a datacenter as a car. Automotive engineers have improved aerodynamics and optimized fuel efficiency with electronic fuel injectors and finely metered air intakes to reach 30 miles per gallon. But if the driver leaves the car idling in the driveway for long periods, its real-world fuel economy steadily drops, because fuel is burned while the car isn't moving. Likewise, a datacenter full of highly efficient computers isn't operating well unless those computers are producing work with each clock cycle.

In examining the energy picture it's helpful to understand where the energy goes. In a typical datacenter, half the energy that comes in is needed to operate the servers and storage devices and networks that connect them. Another one-third is needed to operate the air conditioners and chillers that keep the computers from overheating. And one-sixth of the energy is lost as the electrical current from outside goes through transformers that reduce the voltage to levels computers can use. Analysts say that electricity accounts for 75% of the operating cost of a datacenter.
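That breakdown can be checked with a little arithmetic. In a minimal sketch (the 1,000 kW facility draw is a hypothetical figure chosen for illustration), the half/one-third/one-sixth split implies a Power Usage Effectiveness, total facility power divided by IT power, of about 2:

```python
# Rough energy budget for a typical datacenter, using the fractions above.
total_kw = 1000.0                    # hypothetical facility draw

it_kw = total_kw * (1 / 2)           # servers, storage, and networking
cooling_kw = total_kw * (1 / 3)      # air conditioners and chillers
conversion_kw = total_kw * (1 / 6)   # transformer / power-conversion losses

# The three fractions account for all incoming power.
assert abs((it_kw + cooling_kw + conversion_kw) - total_kw) < 1e-9

# PUE = total facility power / IT equipment power.
pue = total_kw / it_kw
print(f"PUE = {pue:.1f}")            # every watt of computing costs two watts overall
```

A PUE of 2.0 means that for every watt doing computation, another watt goes to cooling and conversion losses, which is why utilization matters so much: wasted compute cycles waste that overhead too.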

Fortunately, industry trends are creating a virtuous cycle of improved performance and reduced power use.

Mobility is the hottest trend in technology today. Everyone in the world wants to use their smartphones to accomplish tasks that used to require PCs. The chip that runs a smartphone is 1,000 times more powerful and 100,000 times smaller than the single computer that operated at MIT in 1965. Smartphones have to work for a long time on very small batteries, so electrical engineers have been forced to design chips that sip power instead of gulping it. Those techniques are making their way into PCs and even servers.

Another hot trend is Cloud computing, in which most of the computing needed by a corporation or government is done in large Cloud-computing centers. There the computing is done by hundreds of computers, with incoming work assigned to computers that are underutilized. Cloud computing becomes even more efficient when many different companies are sharing a datacenter, and each firm's workload is virtualized to be handled on the appropriate number of servers. Different industries have different peak times, with retailers topping out in December, accounting firms in April, and airlines in the summer travel season.

Cloud computing leads to the replacement of thousands of small datacenters with just a few dozen large ones. Size breeds efficiency in datacenters just as it does on farms and in auto assembly plants. In terms of electricity use, the larger datacenters' big floor plans provide a cost justification for extensive energy audits to optimize airflow and avoid hot spots.

However, even in Cloud computing centers with virtualized servers, many servers aren't doing useful work much of the time. The next frontier in energy efficiency is installing software that monitors datacenters and looks for under-utilized servers. It can automatically move work to fully load the most efficient servers in the datacenter and shut down other servers until they are needed. That sort of load balancing has long been standard practice in expensive, multiprocessor mainframe computers. Now software engineers are successfully applying the same virtualization techniques across multiple servers within a Cloud computing center.
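The consolidation idea can be sketched as a simple packing problem. This is a hedged illustration, not any vendor's actual scheduler: the function name, the 15%-utilization workloads, and the 90% capacity ceiling are all assumptions made up for the example. A greedy first-fit-decreasing pass packs workloads onto as few servers as possible so the rest can be powered down:

```python
# Hypothetical sketch of workload consolidation via first-fit-decreasing
# bin packing. Each workload is a fraction of one server's capacity.

def consolidate(workloads, capacity):
    """Pack workloads onto servers, each loaded to at most `capacity`.

    Returns a list of servers, where each server is a list of the
    workload sizes assigned to it.
    """
    servers = []
    for w in sorted(workloads, reverse=True):  # place biggest jobs first
        for s in servers:
            if sum(s) + w <= capacity:
                s.append(w)                    # fits on an existing server
                break
        else:
            servers.append([w])                # bring up one more server
    return servers

# Ten workloads, each lightly loading its own machine (~15% utilization),
# fit comfortably on a couple of well-loaded servers instead.
loads = [0.15, 0.10, 0.20, 0.05, 0.15, 0.12, 0.18, 0.08, 0.15, 0.10]
packed = consolidate(loads, capacity=0.9)
print(f"{len(packed)} servers needed instead of {len(loads)}")
```

First-fit decreasing is only a heuristic, but it captures the point of the paragraph: the same total work concentrated on a few busy machines lets the idle majority be shut off entirely.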

Reducing electricity use is a growing concern for companies that need computing power. Technology has made huge strides in reducing consumption. But datacenter managers need to be vigilant about managing the computers they have to make sure they aren't burning energy without producing useful work.

By Roger Schmidt, IBM Fellow and Chief Engineer on Data Center Energy Efficiency



