Moore's Law CPU Scaling "Is Now Dead" Claims NVIDIA VP; GPU Parallel Computing Is The Future

Earlier this month, SlashGear columnist Michael Gartenberg pondered whether Moore's Law was still relevant to PCs; it seems NVIDIA chief scientist and vice president Bill Dally has only just got the memo. The engineer has penned a guest column for Forbes on the limitations of current CPU technology, and more specifically the fact that, while transistor counts have kept climbing pretty much as Moore predicted, the power scaling that used to accompany each new process generation has ended. "As a result, the CPU scaling predicted by Moore's Law is now dead," Dally writes, before arguing that parallel computing will be our saviour.

"Going forward, the critical need is to build energy-efficient parallel computers, sometimes called throughput computers, in which many processing cores, each optimized for efficiency, not serial speed, work together on the solution of a problem. A fundamental advantage of parallel computers is that they efficiently turn more transistors into more performance. Doubling the number of processors causes many programs to go twice as fast. In contrast, doubling the number of transistors in a serial CPU results in a very modest increase in performance–at a tremendous expense in energy." Bill Dally, NVIDIA

Of course, what's the best-known parallel computing platform around at the moment? Why, NVIDIA's own CUDA architecture. That's now found in the company's GeForce, ION, Quadro and Tesla GPUs, and can be turned not just to graphics crunching but to general-purpose processing.
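
As a rough example of that general-purpose use (again a sketch, with hypothetical host arrays h_x and h_y assumed to be allocated and filled elsewhere), the host side of a CUDA program copies ordinary numerical data to the GPU, launches the kernel above over a million elements, and copies the result back:

```cuda
// Sketch of driving the saxpy kernel above from the host for plain
// number crunching rather than graphics. h_x and h_y are assumed,
// hypothetical host arrays of n floats filled elsewhere.
int n = 1 << 20;                       // one million elements
size_t bytes = n * sizeof(float);
float *d_x, *d_y;
cudaMalloc(&d_x, bytes);               // device-side buffers
cudaMalloc(&d_y, bytes);
cudaMemcpy(d_x, h_x, bytes, cudaMemcpyHostToDevice);
cudaMemcpy(d_y, h_y, bytes, cudaMemcpyHostToDevice);

int threadsPerBlock = 256;
int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
saxpy<<<blocks, threadsPerBlock>>>(n, 2.0f, d_x, d_y);

cudaMemcpy(h_y, d_y, bytes, cudaMemcpyDeviceToHost);
cudaFree(d_x);
cudaFree(d_y);
```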

The problem with multicore processors of the sort Intel and AMD are producing, Dally says, is that each core consumes too much energy per instruction; hooking several of them up in parallel therefore does little for efficiency. To be fair, Dally's points make some sense given the steadily growing power requirements of high-performance chips; still, we don't imagine Intel or AMD will be quite so ready to go along with the NVIDIA VP's suggestions.
