For many years, if you wanted to use a computer, you had to put up with fan noise. A lot of newer computers, like tablets, have changed all that, but although the fan itself is a very low-tech piece of gear, eliminating it often requires some very high-tech developments.
The fan in a computer serves the same purpose as a fan in your car or house: cooling. Like pretty much any machine that does useful work, electronics generate heat as a by-product, and the harder they work, the more heat they generate. In the case of a computer, hard work consists of moving lots of data around, doing lots of calculations and making lots of decisions.
In the earliest days of personal computing, in the late 1970s, heat wasn't a problem: the chips generated some, but it safely dissipated into the surrounding air. Desktop computers of the 1980s came equipped with a fan that ventilated the entire inside of the case.
When the competition for faster and faster computer processors started heating up, so did the processors themselves. With processor speeds rising exponentially, cooling quickly became an issue. By the 1990s processors needed big spiky metal heat sinks: metal to conduct heat away from the electronics speedily, and spiky to increase the surface area exposed to the air, which encourages heat dissipation. Eventually even this wasn't enough, and processors needed their own little fans, without which the processor would burn out in just a few seconds.
You'll still find fans in lots of computers, but many modern computers, like tablets, don't need them, even though they are more powerful than computers from a few years ago that depended on bulky cooling. How does that work?
One way that newer electronics generate less unwanted heat is that they are just more energy efficient, doing the same job as previous generations with less electricity and generating less heat as a result. Chip makers are constantly striving to shrink the circuits on their chips. As the microscopic electronic components get smaller and closer together, they need less power to operate.
Another way modern chips generate less heat is by carefully shepherding available energy. Twenty years ago computers ploughed through every task like a ship in full sail. Everything was on, all the time. But most of the time a computer is actually idle, waiting for the user to do something, and even when doing something a modern computer typically uses only a fraction of its full capacity. At those times part or even all of the machine can be throttled back or even switched off completely. A lot of work has gone into designing electronics that scale their power use moment to moment according to the load placed on them. And with less power going in, less waste heat goes out.
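That moment-to-moment scaling can be pictured with a small sketch. This is not how any real chip or operating system does it; the frequency steps and thresholds below are made up purely for illustration, a bit like a simplified version of the "on demand" clock governors found in real systems:

```python
# A toy clock-speed governor: scale the processor's clock to match the load.
# The frequency steps and the mapping are hypothetical, for illustration only.

FREQ_STEPS_MHZ = [400, 800, 1600, 2400]  # idle speed ... full speed

def pick_frequency(load: float) -> int:
    """Return a clock speed (MHz) for a load between 0.0 (idle) and 1.0 (flat out).

    Lower clock -> less power going in -> less waste heat coming out.
    """
    if not 0.0 <= load <= 1.0:
        raise ValueError("load must be between 0.0 and 1.0")
    # Map the load onto one of the available frequency steps.
    index = min(int(load * len(FREQ_STEPS_MHZ)), len(FREQ_STEPS_MHZ) - 1)
    return FREQ_STEPS_MHZ[index]

# Waiting for the user to do something: idle along at the lowest speed.
print(pick_frequency(0.05))  # -> 400
# Heavy number-crunching: ramp up to full speed.
print(pick_frequency(0.95))  # -> 2400
```

Real hardware goes further, switching off entire idle sections of the chip rather than merely slowing them down, but the principle is the same: spend power only when the work demands it.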
Temperature is not the sole motivation for these developments. Cutting power consumption also makes batteries last longer. And it's just a good way to save a little bit of electricity which, in these energy-conscious times, will attract a lot of fans. Just not the spinning kind that blow air.
- © Fairfax NZ News