Why did CPUs stop getting faster about 5 years ago?
December 9, 2007 4:35 PM
Why are CPUs not getting faster any more? This plot
shows data I scraped from a few sources around the web. I know it's not exhaustive, but I'm trying to pick out long-term, general trends. It's clear to me that processor clock speed has plateaued at roughly 4 GHz. Why?
posted by sergeant sandwich to Computers & Internet (30 answers total)
I am aware of Moore's law and the megahertz myth, and of things like bus speeds, cache sizes, and instruction sets, and I know that comparing clock speeds across processors isn't especially meaningful. I know that Intel stopped pumping up clock speeds; I know about pipelining, branch prediction, multiple cores, and which bottlenecks are where in a computer. I understand that even though CPU clock speed has flattened, actual computing power has continued to increase.
However, surely if a chip designer could run the core at a faster clock, they would. Why can't they?
My understanding was that this is essentially an issue of thermal management: faster switching with a fixed settling time means more current per unit time, which means more resistive heating.
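If I have the textbook relation right (and please correct me if I don't), the dynamic switching power of a CMOS chip goes roughly as

\[
P_{\text{dyn}} \approx \alpha \, C \, V_{dd}^{2} \, f
\]

where \(\alpha\) is the activity factor, \(C\) is the switched capacitance, \(V_{dd}\) is the supply voltage, and \(f\) is the clock frequency. So pushing \(f\) up, along with whatever increase in \(V_{dd}\) is needed to meet timing at that frequency, blows up the power that has to be dissipated, on top of static leakage, which I gather has also gotten worse at small feature sizes.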
However, someone else pointed out to me recently that this might be an issue with the RC time constant of the interconnects on the chip.
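The version of that argument I've heard, which I may be stating imprecisely, is that a long on-chip wire behaves like a distributed RC line with a delay on the order of

\[
\tau_{\text{wire}} \approx R_{\text{wire}} \, C_{\text{wire}}
\]

and that as feature sizes shrink, the wire cross-section shrinks too, so resistance per unit length goes up and wire delay doesn't improve the way transistor switching delay does. Is that the real limiter, or is it heat?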
Ideally I'd like to find an article about this phenomenon from an EE/physics point of view, preferably by someone in the industry. Most preferable would be a journal or IEEE publication; a trade magazine would be good too. An article in something like Wired would be okay, but I need to cite a source, and the popular press is notoriously bad when it comes to this sort of thing.
However, my google-fu is really failing me here, so any explanation, pointer to search terms, or really anything helpful would be greatly appreciated.