Changing times for clock speed

HAYDEN WALLES
Last updated 05:00 19/02/2013

Way back in the dark ages of personal computing, computer power was easily gauged by something called the clock speed.

This number kept going up through the eighties, the nineties and even after the turn of the millennium, but during the latter part of the last decade it more or less plateaued. And yet computers continue to get more powerful. Why was clock speed once so important - and why is it not now?

Computers (and digital electronics generally) use an internal clock to synchronise their activities. On each clock tick, data moves from one place to another and basic operations are performed. The faster the clock ticks, the more operations can be performed in a given time, and hence the faster the machine. The earliest personal computers, back when Bill Gates and Steve Jobs were barely out of high school, had clocks that ticked at a few megahertz - a few million times a second.
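For those who like to see the arithmetic, the little sketch below (written in Go, using ballpark speeds for each era rather than exact chip specifications) works out how long a single tick lasts at a few different clock speeds:

package main

import "fmt"

func main() {
	// A tick lasts one second divided by the clock speed in hertz.
	// The speeds below are rough figures for their eras, not exact
	// specifications of any particular chip.
	eras := []struct {
		name string
		hz   float64
	}{
		{"late-1970s micro, ~4 MHz", 4e6},
		{"early-1990s desktop, ~33 MHz", 33e6},
		{"modern core, ~3 GHz", 3e9},
	}
	for _, e := range eras {
		fmt.Printf("%-30s %8.2f nanoseconds per tick\n", e.name, 1e9/e.hz)
	}
}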

In those simple days there was just one clock that governed the entire computer.

At the relatively low clock speeds being used this was fine, but it couldn't last. People wanted more power from their computers. A very direct way to do that was to increase the clock speed, and that's what happened.

Central processing unit (CPU) clock speeds went from a few megahertz in the late 1970s to a few tens of megahertz by the early 1990s. Other computer components - main memory in particular - couldn't keep up with these nippy CPUs, and the era of one clock to rule them all was over. Instead, a special clock regulated operations inside the CPU, separate from the clock governing the rest of the system.

But no CPU is an island. It still has to interact with the rest of the system which, from its point of view, operates at glacial speeds. Because of this, modern chips are designed to keep dealings with the outside world to a minimum. For example, to cut down accesses to the computer's main memory, which can take an age, the CPU keeps a cache with copies of parts of main memory so that it can get at frequently used data quickly. If an operation can't find what it needs in the cache and must retrieve it the long way, the CPU doesn't twiddle its thumbs while it waits. Modern CPUs don't perform the basic operations of a program one after the other but do several at once, so there is always plenty to do.
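You can get a rough feel for the cache yourself by walking through a big block of memory in two different orders. The Go sketch below is only illustrative - the exact timings depend on the machine - but on most computers the cache-friendly order finishes noticeably faster:

package main

import (
	"fmt"
	"time"
)

const n = 4096 // a 4096 x 4096 grid of numbers, roughly 128 megabytes

func main() {
	grid := make([][]int64, n)
	for i := range grid {
		grid[i] = make([]int64, n)
	}

	// Row by row: neighbouring values in memory, so the cache keeps
	// handing the CPU data it has already fetched.
	start := time.Now()
	var sum int64
	for i := 0; i < n; i++ {
		for j := 0; j < n; j++ {
			sum += grid[i][j]
		}
	}
	fmt.Println("cache-friendly order:  ", time.Since(start))

	// Column by column: every read jumps far away in memory, the cache
	// misses, and the CPU has to wait on slow main memory instead.
	start = time.Now()
	for j := 0; j < n; j++ {
		for i := 0; i < n; i++ {
			sum += grid[i][j]
		}
	}
	fmt.Println("cache-unfriendly order:", time.Since(start))
	_ = sum // the totals are all zero; they exist only to force every read
}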

In fact, the ability to do many things at once long ago overtook raw clock speed as the driver of increased CPU power. Clock speed might not have risen much in the past few years, but computing power certainly has, thanks to parallel processing features that allow individual programs to carry out many basic operations simultaneously, and multicore CPUs that allow many programs to run at the same time.
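To watch several cores pitch in at once, a sketch along these lines (again in Go, and again purely illustrative) splits a big counting job across however many cores the machine reports:

package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	cores := runtime.NumCPU()
	fmt.Println("cores reported by this machine:", cores)

	const total = 200_000_000 // add up the numbers 0, 1, 2, ... total-1
	results := make([]uint64, cores)

	var wg sync.WaitGroup
	for w := 0; w < cores; w++ {
		wg.Add(1)
		go func(w int) {
			defer wg.Done()
			// Each worker takes every cores-th number and adds up its
			// share, running (ideally) on its own core alongside the rest.
			var s uint64
			for i := w; i < total; i += cores {
				s += uint64(i)
			}
			results[w] = s
		}(w)
	}
	wg.Wait()

	var sum uint64
	for _, s := range results {
		sum += s
	}
	fmt.Println("grand total:", sum)
}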

For a long time it was a good rule of thumb that the more hertz a computer had the better, but recent developments have shown that you don't always need more haste to get more speed.

- © Fairfax NZ News
