Kirk Cameron Cuts Computers' Power Use

Supercomputers—and PCs—That Sip Power

A decade ago, after attending a lecture on cutting the power consumption of electronic devices, Kirk Cameron started pondering electricity usage in the supercomputers he studied. Powering a big machine of that era, he discovered, cost almost $8 million a year, and technology on the horizon might eat up 10 times as much electricity. “That scared me,” says Cameron, a computer science professor at Virginia Tech. “The conventional wisdom at the time was that power would not be an issue.”

The realization spurred Cameron, now 40, to develop power management software that is used on supercomputers—and now PCs—worldwide. While other power-saving programs do little more than turn off the monitor and largely ignore a computer’s central processing unit, Cameron says, his software manages the CPU much like a dimmer controls a light fixture. For reading, you’ll want full light, but for a romantic dinner, low light is just fine. “You tell the software how aggressive you want it to be,” says Cameron, a lifelong computer buff who started playing Pong on his grandfather’s Atari at age 5.
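The dimmer analogy maps onto dynamic voltage and frequency scaling, where a CPU steps between discrete clock speeds. A minimal sketch of the idea in Python follows; the function name, frequency steps, and the way an "aggressiveness" dial scales demand are all hypothetical illustrations, not details of Cameron's actual software.

```python
# Hypothetical sketch of dimmer-style CPU frequency selection.
# FREQ_STEPS_MHZ and the demand-scaling rule are assumptions for
# illustration, not taken from the software described in the article.

# Discrete frequency steps (MHz) a CPU might expose.
FREQ_STEPS_MHZ = [800, 1600, 2400, 3200]

def pick_frequency(load, aggressiveness):
    """Choose a frequency step for the current load (0.0-1.0).

    aggressiveness (0.0-1.0) is the user's power-saving dial:
    higher values bias toward lower frequencies, like dimming a light.
    """
    if not (0.0 <= load <= 1.0 and 0.0 <= aggressiveness <= 1.0):
        raise ValueError("load and aggressiveness must be in [0, 1]")
    # Scale the demanded capacity down by the aggressiveness setting.
    demand = load * (1.0 - 0.5 * aggressiveness)
    # Pick the lowest step that still covers the scaled demand.
    for step in FREQ_STEPS_MHZ:
        if step >= demand * FREQ_STEPS_MHZ[-1]:
            return step
    return FREQ_STEPS_MHZ[-1]
```

At full load with the dial off, the sketch runs the CPU flat out; turning the dial up trades peak speed for lower power, which is the trade-off the article describes.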