Author Topic: Is Computer Progress at an End?  (Read 1327 times)

Offline Wise Son

  • Honorary Wakandan
  • *****
  • Posts: 3297
  • "intelligent and slightly Black. I'm from the 80s"
Is Computer Progress at an End?
« on: August 02, 2011, 03:05:11 pm »

Progress Hits Snag: Tiny Chips Use Outsize Power

Published: July 31, 2011

For decades, the power of computers has grown at a staggering rate as designers have managed to squeeze ever more and ever tinier transistors onto a silicon chip — doubling the number every two years, on average, and leading the way to increasingly powerful and inexpensive personal computers, laptops and smartphones.

Now, however, researchers fear that this extraordinary acceleration is about to meet its limits. The problem is not that they cannot squeeze more transistors onto the chips — they surely can — but instead, like a city that cannot provide electricity for its entire streetlight system, that all those transistors could require too much power to run economically. They could overheat, too.

The upshot could be that the gadget-crazy populace, accustomed to a retail drumbeat of breathtaking new products, may have to accept next-generation electronics that are only modestly better than their predecessors, rather than exponentially faster, cheaper and more wondrous.

Simply put, the Next Big Thing may take longer to arrive.


The problem has the potential to counteract an important principle in computing that has held true for decades: Moore’s Law. It was Gordon Moore, a founder of Intel, who first predicted that the number of transistors that could be nestled comfortably and inexpensively on an integrated circuit chip would double roughly every two years, bringing exponential improvements in consumer electronics.
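The doubling the article describes compounds quickly. A minimal sketch of that arithmetic (the starting count and time span below are illustrative assumptions, not figures from the article):

```python
# Project a transistor count under Moore's law: doubling every two years.
def transistors(start_count, years, doubling_period=2):
    """Return the projected count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Example: a chip with 1 billion transistors, projected 10 years out
# (five doublings), grows 32-fold:
print(transistors(1e9, 10))  # 32 billion
```

Five doublings in a decade is what turns modest-looking two-year steps into the "staggering rate" the article refers to.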

If that rate of improvement lags, much of the innovation that people have come to take for granted will not happen, or will happen at a much slower pace. There will not be new PCs, new smartphones, new LCD TVs, new MP3 players or whatever might become the new gadget that creates an overnight multibillion-dollar industry and tens of thousands of jobs.

In their paper, Dr. Doug Burger of Microsoft Research and fellow researchers simulated the electricity used by more than 150 popular microprocessors and estimated that by 2024 computing speed would increase only 7.9 times, on average. By contrast, if there were no limits on the capabilities of the transistors, the maximum potential speedup would be nearly 47 times, the researchers said.
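To get a feel for the gap between those two projections, it helps to convert each total speedup into an implied annual rate. A small sketch (assuming a 13-year interval from the 2011 article to 2024; the excerpt does not state the baseline year explicitly):

```python
# Convert the article's total-speedup projections into implied annual growth rates.
# The 13-year interval (2011 -> 2024) is an assumption, not stated in the excerpt.
years = 13
for label, total_speedup in [("power-limited", 7.9), ("unconstrained", 47.0)]:
    annual = total_speedup ** (1 / years)  # compound annual growth factor
    print(f"{label}: {total_speedup}x total, roughly {annual:.2f}x per year")
```

Under these assumptions, the power-limited case works out to roughly 17 percent per year, versus about 34 percent per year if transistors could all be run at full tilt, which is what makes the projected slowdown so stark.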

Some scientists disagree, if only because new ideas and designs have repeatedly come along to preserve the computer industry’s rapid pace of improvement. Dr. William Dally, Nvidia’s chief scientist, for instance, is sanguine about the future of chip design.

“The good news is that the old designs are really inefficient, leaving lots of room for innovation,” he said.

(more in the link)

"Children, if you are tired, keep going; if you are hungry, keep going; if you want to taste freedom, keep going."
-Harriet Tubman