Friday, April 23, 2010

Big numbers are now much smaller

When I was a child back in the 1960s, one hundred was a big number.

One thousand was a huge number.

And a million -- well that was an altogether staggering amount.

Although the imperial system of weights and measures ruled back then, and we didn't have fancy metric stuff like metres, grams or joules, we still used metric multipliers such as kilo and mega. However, that was usually as far as it went.

Our radios tuned to frequencies measured in kilocycles (remember, this was before we switched to hertz), and even the newfangled television only used megacycle frequencies.

When the humble microcomputer first arrived back in the late 1970s, its memory was measured in bytes, hundreds of bytes or (if you were really rich) kilobytes.

As you can see, 10^6 was just about the maximum multiplier used in those days for almost anything, and I recall being the only person in my class at school who knew what a googol was.

These days, everyone has heard of Google (although I bet lots still don't know about the mathematical googol its name plays on) and sitting beside me on the desk is a drive with one terabyte of capacity.

In fact, the prefixes "giga" and "tera" have replaced "kilo" when talking about memory and disk storage, and now we also refer to processor speeds in gigahertz rather than the megahertz of those old 8-bit CPUs.

The earliest serial data links provided data transfer at rates measured in mere bits per second; now we're talking megabits, hundreds of megabits and even gigabits for today's state-of-the-art wireless, DSL and fibre-based links.
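Just for fun, here's a rough back-of-the-envelope calculation (a small Python sketch, with link speeds I've picked purely for illustration) of how long it would take to shift that one-terabyte drive over links of various vintages:

```python
# Rough transfer times for one terabyte (10^12 bytes = 8 x 10^12 bits)
# over links of various eras. The speeds below are illustrative only.
TERABYTE_BITS = 8 * 10**12

links = {
    "300 bit/s acoustic modem": 300,
    "56 kbit/s dial-up":        56_000,
    "8 Mbit/s DSL":             8_000_000,
    "100 Mbit/s fibre":         100_000_000,
    "1 Gbit/s fibre":           1_000_000_000,
}

for name, bits_per_second in links.items():
    seconds = TERABYTE_BITS / bits_per_second
    days = seconds / 86_400  # seconds in a day
    print(f"{name:26s} ~ {days:12,.1f} days")
```

At 300 bits per second you'd be waiting the better part of a millennium; at a gigabit it's a couple of hours.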

This huge change in multipliers has happened in the relatively short span of just 40-50 years, so where will we be in another half-century?

Well, if we extrapolate from this history, we'd better get used to dealing with prefixes such as peta, exa, zetta and yotta.
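For anyone who hasn't met them yet, a quick listing (just a simple Python sketch of the standard decimal prefixes) shows how fast the powers of ten climb:

```python
# The decimal SI prefixes and their powers of ten.
prefixes = [
    ("kilo",  3), ("mega", 6),  ("giga",  9),  ("tera", 12),
    ("peta", 15), ("exa", 18),  ("zetta", 21), ("yotta", 24),
]

for name, exponent in prefixes:
    print(f"1 {name}byte = 10^{exponent} bytes = {10**exponent:,} bytes")
```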

No doubt many folks are thinking, "What on earth could you possibly do with a zettabyte of memory? That's 10^21 bytes!"

Well, we live in an age when even those who build the computers and write the software have consistently underestimated the rate of growth to come. Whether it is true or just folklore, IBM chairman Thomas J. Watson and Microsoft chairman Bill Gates have both been rumored (or misquoted) as grossly underestimating the number of computers the world would need (Watson) and the amount of memory computer users would find adequate (Gates).

These days, when NASA's Solar Dynamics Observatory is streaming 1.5TB of data back to Earth each day, it's only a matter of time before we start talking petabytes instead of terabytes.
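At that rate the petabytes arrive sooner than you might think. A quick sketch (assuming a steady 1.5TB per day and decimal prefixes, i.e. 1 PB = 1,000 TB):

```python
# How long does SDO take to accumulate a petabyte at 1.5 TB/day?
TB_PER_DAY = 1.5
PETABYTE_IN_TB = 1_000  # decimal prefixes: 1 PB = 1,000 TB

days_to_petabyte = PETABYTE_IN_TB / TB_PER_DAY
print(f"About {days_to_petabyte:.0f} days "
      f"(~{days_to_petabyte / 365:.1f} years) per petabyte")
# About 667 days (~1.8 years) per petabyte
```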

And, for the record, the human brain is purported to have just 125MB of user-defined memory. That's a very small number (these days).
