Friday, December 3, 2010

IBM to be arrested for breaking Moore's Law?

Moore's law has been one of the most enduring truisms of the computer era.

For those who aren't familiar with Moore's Law (and there must be at least one person in the world who's been sleeping for the past 45 years), Gordon Moore, who went on to co-found Intel, observed back in 1965 that the number of transistors that could be packed onto a single chip would double roughly every 12 months, a pace he later revised to every two years.

And you know what? He was dead right.

The effect of this exponential growth in the density of integrated circuits is that computing power has grown roughly in step with the ever-increasing complexity of the chips that make up our modern CPUs.
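
To get a feel for what that exponential doubling implies, here's a back-of-the-envelope sketch in Python. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) and the two-year doubling period are illustrative assumptions on my part, not anything IBM or Intel has signed off on:

    # Back-of-the-envelope Moore's law: transistor counts assumed to
    # double every two years from the Intel 4004's ~2,300 in 1971.
    START_YEAR, START_COUNT = 1971, 2300
    DOUBLING_PERIOD_YEARS = 2

    for year in (1971, 1981, 1991, 2001, 2010):
        doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
        print(f"{year}: ~{START_COUNT * 2 ** doublings:,.0f} transistors")

By 2010 that simple model predicts around 1.7 billion transistors, which is in the right ballpark for the biggest chips shipping today.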

However, physics has looked for some time as if it's catching up with Moore's law and might eventually render it invalid.

The problem is that in order to increase the number of transistors and produce faster computer chips, the size of each individual transistor, and of the paths that connect them, has had to keep shrinking.

Eventually, we must reach a point where we just can't make things any smaller, for a number of reasons.

Once things get too small, the weird world of quantum effects starts to displace the familiar rules of classical physics. Instead of flowing smoothly from junction to junction, as they do at larger scales, electrons fall prey to those quantum effects, tunnelling through barriers that ought to contain them, with certainty replaced by probability.

Those who develop our computer chips have been warning for some time that we're rapidly approaching the limits of miniaturization, a wall that threatens to end the seemingly endless rule of Moore's law.

However, when faced with a wall that blocks your path, what do you do?

The smart money says that instead of banging your head against it and drawing blood, you're much better off setting out in a new direction, one where the wall won't be a problem.

And indeed, that's exactly what some researchers are doing. What's more, these scientists claim that when they succeed, Moore's law will be eclipsed by new laws dictating a massive leap in computing power hitherto unimagined by Gordon Moore or any of his colleagues at Intel.

Researchers at IBM now claim that the future of computing is something they call "nanophotonics" (hooray, a new buzzword!).

The increase in raw computing power that this blend of conventional silicon technology and optical computing promises to deliver is mind-boggling.

IBM says that its nanophotonic processors will deliver speeds measured in exaflops (10^18 floating-point operations per second), rather than the current record-holding super-computer performance of a meagre 2.67 petaflops (2.67 * 10^15 floating-point operations per second).
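
To put those two orders of magnitude side by side, here's a quick sanity-check calculation, using only the figures quoted above:

    # How big a jump is an exaflop over the current 2.67 petaflop record?
    record_flops = 2.67e15   # 2.67 petaflops, the figure quoted above
    exaflop = 1e18           # 10^18 floating-point operations per second

    print(f"An exaflop machine would be ~{exaflop / record_flops:.0f}x faster.")
    # prints: An exaflop machine would be ~375x faster.

In other words, IBM is talking about a leap of nearly 400 times over today's fastest machine.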

The first of IBM's new computers using this technology is due to be installed next year and, although it will deliver a comparatively humble 10 petaflops, it will do so using far less space and power than a conventional computer of similar capability would require.

So the future of super-computing is bright -- quite literally.

And Moore's law?

Well, that may become a quaint relic of the 20th century -- itself eclipsed and replaced by a new law that defines much faster growth over shorter timeframes.

I love this industry.
