Friday, October 30, 2009

Internet and aeroplanes, some interesting parallels

No, I'm not writing about the move to equip long-haul flights with wireless internet connections for passengers and their laptops.

I'm talking about the fact that this week the internet celebrated its 40th birthday.

From some crudely interconnected computers back in 1969 to the now ubiquitous "world wide web", the internet has grown far beyond anything that Leonard Kleinrock ever envisaged when, forty years ago, he and some colleagues started hooking a few computers together.

Of course, the first thing that happened with that hookup was that the system crashed partway through the very first login. Thank goodness we now have vastly improved software, for without it the Net would never have gotten off the ground.

Like many things relating to computers, the growth of the Net has been somewhat exponential: after that initial 1969 hookup, things progressed only very slowly. In fact, even 14 years later in 1983, there were still little more than 200 computers comprising the whole internet.

One of the major growth spurts took place in the late 1990s, when the ready availability of low-cost dial-up modems and an abundance of ISPs saw increasing numbers of "ordinary people" discovering the delights that were to be had online.

Today it is estimated that there are some 1.7 billion people who use the internet.

But what's this got to do with aircraft?

Well strangely enough, aviation seems to have followed a similar growth path.

Once the Wright brothers established the viability of heavier-than-air flight back in 1903, the fledgling aviation industry slowly began to grow and expand.

It took almost forty years before commercial passenger flights became commonplace and recreational aviation was an affordable pastime.

If we compare the state of the aviation industry in 1943 with that of the current day, perhaps we get a little insight into where the internet may be headed.

These days, people think nothing of committing their lives and valuables to regular long-haul flights and flying has become just another part of living in the 21st century.

Perhaps the internet will continue to invade our businesses and all aspects of our day-to-day lives.

While some approach cloud computing with the same trepidation with which those early pioneers approached leaving the safety of the ground, it's only a matter of time before we feel confident in committing all our valuable data, information and services to "the ether" that makes up the internet.

And, just as the aviation industry became a key component of war in the 1940s, it now seems that the internet constitutes a new frontline battlefield -- with cyberwarfare and cyberterrorism "hot topics" within the halls of many nations' defense departments.

Just as recent military conflicts have shown the crucial importance of air-power in war, so we will see online strategies and weapons becoming increasingly important in future battles.

That there are so many similarities between the growth of aviation and the growth of the internet suggests that the human race has a tenacious ability to harness technology and adapt it quickly to the needs of the moment. That bodes well for our future on the planet.

Although most people recognise the internet as a huge step forward in our technological evolution, I'm picking that we have yet to feel the full import of its might. The best is yet to come.

The synergy created by the combined knowledge and expertise the Net brings together may well produce huge advances in other areas of science and technology.

Now that we really do have a global community of scientists and researchers who can collaborate over great distances in real-time, look out for even more impressive breakthroughs in a wide range of fields.

The internet is big -- but it's going to get a lot bigger.

Comparatively speaking, we're still using Tiger Moths and DC3s to move our data around. It's only a matter of time before we roll out the cyberspace equivalents of the Jumbo Jet and the Concorde. That's when the real fun will begin!

I just hope I live long enough to see a good deal of the internet's future. However, I've already been asked "what was it like before the internet?" by several inquiring youngsters.

Oh dear, just like the internet, I must be getting old.

Friday, October 23, 2009

The (solid) state of hard-drive storage

What part of your system has more moving bits than any other?

What's the most unreliable yet valuable part of your computer system?

What part of your system is probably the biggest bottleneck to the overall performance of your computer?

Most people will realise that I'm talking about the hard-drive.

The basic design of the hard drive hasn't changed for decades. It consists of one or more metal disks (platters) that spin at very high speed (over 10,000 RPM in recent high-performance drives) and a number of tiny heads that float less than a hair's breadth above those platters -- a disaster just waiting to happen.

Imagine a modern jet-fighter flying at the speed of sound just a couple of metres above the ground - well if you were to scale up the internals of your hard-drive, that's the kind of action we're talking about.

And, like any device that contains moving parts, your hard drive will eventually wear out -- that's if the heat generated by its electronics and motor doesn't kill it first.

This, of course, is why we make backups and use RAID arrays when storing important data.

However, the days of the now ancient mechanical hard-drive are drawing to a close and already we're seeing a family of (netbook) computers that ship without them, yet offer the ability to store gigabytes of programs and data.

Enter the age of the Solid State Drive (SSD).

Thanks to rapid advances in non-volatile memory technology, it's now possible to create solid state storage systems that emulate a regular SATA hard drive -- but without all the vulnerable mechanics or moving parts.

Think of them as an uber-USB drive -- but much bigger and faster.

SSDs have numerous advantages over conventional hard drives and those include far greater reliability, greatly improved access speeds, less vulnerability to unexpected power failures, potentially lower power consumption and faster read/write data rates.

These solid-state drives are also far less susceptible to the effects of physical shock or extremes of temperature and humidity. In effect, they're almost the perfect storage medium.

The only real downside at present is their somewhat higher price than mechanical drives -- but that will also change over time.

Already available in regular hard-drive form-factors and with the ubiquitous SATA electrical interface, it's now possible to perform "plug and play" replacements with almost any desktop or laptop computer, swapping the existing drive for an SSD equivalent of up to 1TB.

So why has it taken so long for SSDs to catch on? Apart from price, what's been holding them back?

Well part of that boils down to a small issue called write-cycle-life or "write endurance".

The problem with Flash memory (which most SSDs use) is that you can only write to it a fixed number of times before the computer equivalent of Alzheimer's sets in. Once the write-cycle life of a Flash memory cell is exceeded, it can no longer reliably hold data and the write may either fail or random bit-flipping can occur.

Early Flash memory suffered from a write-cycle life that was measured in just thousands of write operations but this figure has steadily improved and most good-quality silicon now has a life measured in hundreds of thousands, or even millions, of individual write-cycles.

Of course you might think that's still an awfully small number, when you work out how quickly computers like to flip bits and how often data gets written to a hard drive -- but things really aren't that bad.
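A quick back-of-envelope calculation shows why things aren't that bad. The figures below (cell endurance, drive capacity, daily write volume) are purely illustrative assumptions, not the specifications of any real drive:

```python
# Back-of-envelope SSD lifetime estimate.
# All figures are illustrative assumptions, not the specs of any actual drive.

CELL_ENDURANCE = 100_000      # write cycles each cell can tolerate (assumed)
DRIVE_BYTES = 64 * 10**9      # a 64 GB drive (assumed)
DAILY_WRITES = 20 * 10**9     # 20 GB written per day -- heavy use (assumed)

# With perfect wear levelling, writes are spread evenly across the whole
# drive, so total writable data = capacity x per-cell endurance.
total_writable = DRIVE_BYTES * CELL_ENDURANCE
lifetime_days = total_writable / DAILY_WRITES
lifetime_years = lifetime_days / 365

print(f"Estimated lifetime: {lifetime_years:,.0f} years")
```

Even with deliberately pessimistic endurance and aggressive daily writes, the drive outlives its owner many times over -- which is exactly why the wear-levelling tricks described below matter so much.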

That's because the makers of SSDs have implemented some clever schemes to mitigate the limited write-cycle life of the devices used.

For a start, they often use a level of indirection that ensures write operations are spread out across the entire array of memory. This means that when you re-write a sector to the SSD, it may actually be written to a totally different part of the memory array from the one it originally occupied. This is done so as to spread those write-cycles out and not excessively "wear" any given set of memory locations.
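That indirection can be sketched in a few lines. The toy model below (the block count, the "least-worn free block" policy, the class and method names) is entirely hypothetical and far simpler than any vendor's real firmware, but it shows how a mapping table spreads repeated rewrites of one logical sector across many physical blocks:

```python
# Toy wear-levelling sketch: a logical sector number is mapped through an
# indirection table to whichever free physical block is least worn.
# Purely illustrative -- real SSD firmware is vastly more sophisticated.

class ToySSD:
    def __init__(self, num_blocks):
        self.wear = [0] * num_blocks      # write count per physical block
        self.data = [None] * num_blocks   # contents of each physical block
        self.mapping = {}                 # logical sector -> physical block

    def write(self, sector, value):
        old = self.mapping.get(sector)
        # choose the least-worn physical block that isn't currently mapped
        in_use = set(self.mapping.values())
        free = [b for b in range(len(self.wear)) if b not in in_use]
        target = min(free, key=lambda b: self.wear[b])
        self.data[target] = value
        self.wear[target] += 1
        self.mapping[sector] = target
        if old is not None:
            self.data[old] = None         # old copy becomes reusable

    def read(self, sector):
        return self.data[self.mapping[sector]]

ssd = ToySSD(num_blocks=8)
for i in range(100):                      # hammer the same logical sector
    ssd.write(0, f"rev{i}")
print(ssd.read(0))                        # latest revision survives
print(ssd.wear)                           # wear is spread almost evenly
```

Without the indirection, those 100 rewrites would all have landed on one block; with it, every block carries a near-equal share of the wear.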

Some pretty sophisticated caching is also done using conventional RAM, so that when a sector is constantly being re-written it's not committed to the actual Flash memory until the updates are completed or some other condition (such as a power failure) is signalled.

By use of these clever tactics, the effective write-cycle life of an SSD in regular use already exceeds the MTTF of a conventional hard drive by at least an order of magnitude.

Bearing all this in mind it becomes pretty obvious that the days of the existing whiny, spinning, vulnerable mechanical hard drive are numbered and that your next PC (or maybe the one after that) will almost certainly come fitted with a solid-state drive.

The arrival of the affordable SSD may do more for the effective power of the desktop computer over the next five or six years than enhancements in CPU architecture can contribute. The bottleneck and comparative unreliability of the slow hard-drive is about to be eliminated by some clever silicon.

Cheaper, faster, smaller -- when it comes to computers and storage, that's the way the future's shaping up.

Isn't progress a wonderful thing?

Friday, October 16, 2009

How eBooks will change the publishing industry

It would appear that, after a very long gestation period, the e-book reader is finally catching on.

Not only has Amazon enjoyed very healthy sales of its Kindle but a number of other vendors have announced the launch (or imminent launch) of competing products.

Of course this won't spell the end of ink-stained paper any time soon, but it will force some major adjustment by publishers and distributors alike -- an adjustment some may be reluctant to embrace.

The book publishing industry is rapidly facing the same situation that confronted the recording and movie industries a decade or so ago, and it will be very interesting to see whether they've learned anything from the fiasco that resulted there.

Until now, book publishers have been protected from the effects of the digital age, simply because their wares were not readily available in digital form -- and that meant one very important thing:

The cost of copying exceeded the cost of buying a genuine edition.

This was also the case for music and movies right up until the arrival of CD/DVD burners and the internet. Sure, you could borrow a friend's LP or CD and copy it to cassette tape but that was a slow process and the second-generation copy was always inferior to the original. It was only when the original material was provided in digital format and true bit-for-bit copying was affordable that any real threat was posed to the (then) existing business models.

Well, now that we have truly practical e-book readers and Google is well on the way to digitising all existing paper-based publications, book publishers will soon be in exactly the same boat.

If publishers don't adjust and revise their pricing to reflect the much lower cost that illegal copying represents to their digital volumes, they will lose enormous amounts of money to piracy.

Right now, copying a best-seller would involve hours of photocopying or scanning -- something that's just not worth it for a $20 book. Once that book is already available in digital format, copying will be as simple as firing up suitable software to circumvent the DRM and then write the resulting "ripped" file to disk. Once "freed", it would only be a matter of minutes before such books became available for "free" download on various P2P networks -- as much music and video content already is.

So how can publishers compete?

Well they could simply refuse to release works in digital format -- but that would be like the music industry sticking with cassette tapes and the movie industry sticking with 16mm film. It simply would not work because people demand the many benefits that digital media provides.

The reality is that if book publishing is to survive as an industry, they have to significantly drop their prices -- to the point that it remains more attractive to buy than to steal.

Given that "going digital" will save the publishers a huge amount of money by eliminating printing and transport costs, while also allowing them to sell directly to the customer, I would expect the price of a digital edition to be a tiny fraction of what the paper edition commands.

I also suspect we'll see a shift away from the "buy this book" model towards commercially operated libraries where, for a fixed monthly or annual fee, you can borrow digital editions of any book you want.

Providing the price is low enough and the process is simple enough, this model may be the single greatest hope the industry has for its continued survival.

People value their time, and if they can simply log into their e-library, search (using friendly software) for the title they want and download it onto their e-book reader -- all for a fixed flat-rate fee -- then few will bother with the hassles of P2P and the legal, logistical and security risks such systems pose.

An e-book reader with wireless connectivity would then become a doorway to the world's largest library of book titles.

It now becomes very simple to see where Google is headed with its service doesn't it?

Clever, very clever!

Friday, October 9, 2009

The online threat level keeps rising

If ever there was a time for individual net-users, system administrators and those who set IT policies within larger entities to be vigilant and pro-active, it is now.

This week alone has seen a wave of vulnerabilities, alerts and news reports that show just what a hostile place the online world has become.

As well as the surprisingly little-publicised browser vulnerability that affects the safety and security of SSL connections, the wires have been flooded with stories about the ease with which phishers managed to secure the login IDs and passwords of Yahoo, Hotmail and GMail users.

Then there was word of a new critical vulnerability in the ubiquitous Adobe PDF reader. Using a suitably constructed document file, hackers can gain control of a user's computer -- enabling all sorts of nefarious new code to be uploaded so as to turn that machine into a spambot or to install trojans, keyloggers or other malware.

One of the astounding revelations that has come from the Hotmail phishing exploits is the weakness of passwords some people have chosen.

According to reports, the most common password was "123456" (or some other run of sequential digits) -- something that would send a shiver down the spine of any half-decent system admin or security consultant.
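As a rough illustration, here is the kind of simple sanity check a signup form might run to reject the weakest choices. The rules and the tiny "common passwords" list below are assumptions for demonstration only, not any real site's policy:

```python
# Illustrative password sanity check -- the rules and the sample list of
# common passwords are assumptions for demonstration, not a real standard.

import re

COMMON = {"123456", "password", "qwerty", "111111", "abc123"}  # tiny sample

def is_weak(pw: str) -> bool:
    if pw.lower() in COMMON:
        return True
    if len(pw) < 8:
        return True
    # catch runs of sequential or repeated digits like "12345678" or "987654321"
    if re.fullmatch(r"\d+", pw):
        diffs = {ord(b) - ord(a) for a, b in zip(pw, pw[1:])}
        if diffs <= {1} or diffs <= {-1} or diffs <= {0}:
            return True
    # require a mix of at least three character classes
    classes = sum(bool(re.search(p, pw)) for p in (r"[a-z]", r"[A-Z]", r"\d", r"\W"))
    return classes < 3

print(is_weak("123456"))        # the most common phished password
print(is_weak("Tr0ub4dor&3"))   # passes these toy rules
```

Even a trivial filter like this would have rejected the passwords that topped the leaked Hotmail list.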

Is this recent hike in hacks, vulnerabilities and successful phishing exploits just an anomaly or does it point to a dangerously poor understanding of basic security procedures within the online community?

There is some good news, however: a lack of funds need no longer be a hurdle to equipping your PC(s) with anti-virus/anti-malware software, and Microsoft's new security suite has been given fairly positive reviews. On the other hand, despite planning to release a bevy of patches next week, the software giant still hasn't done anything about the gaping hole in its SSL code.

If this trend continues, and there's no reason to believe it won't, it really is time for all computer users to take a good long look at the measures they have in place to protect their systems and the data entrusted to them.

Basic commonsense such as using strong passwords that are changed at regular intervals is a great start. The installation and maintenance of reliable anti-virus/malware systems is also critical to security.

However, as software vendors continue to patch their holes, it's increasingly important to educate users as to the many different kinds of "human engineering" that criminals are now resorting to.

The fact that the wife of FBI director Robert Mueller has banned him from using the internet after a "close call" with a phishing scam is proof that we should never assume that anyone knows how to recognise and avoid such attacks.

Remember -- be careful out there.

Friday, October 2, 2009

A nuke-powered cellphone or laptop?

One thing is a given: that all our modern electronic devices require electrical energy to function.

While many of our computers and other hi-tech appliances are happy to hook up to the mains, almost everything we use these days (even that desktop PC) has a battery of some kind in it.

Way back in the dim dark days of electronics, portable devices were powered by stacks of small zinc-carbon cells. Even before the arrival of the transistor, portable electronic equipment would run (albeit not for long) on these simple batteries.

With the arrival of solid-state technology, the battery requirements of portable devices were significantly reduced. No longer was it necessary to have two batteries (one for the filaments and another for the high-tension supply); a single 9V battery or an assortment of AA, C or D-sized cells would suffice.

However, the chemistry inside these batteries was still predominantly zinc/carbon and they were primary cells (not rechargeable). The single largest cost of ownership with these old devices was the price of regular battery replacement.

Tired of the endless expense of primary cells, new forms of secondary (rechargeable) cells began to appear, the main one using a nickel-cadmium chemistry. Although cadmium is a toxic and environmentally harmful material, this was in an era long before such things were an issue so NiCads (as they became known) proliferated and took portable devices to a new level of convenience and cost-effectiveness.

Since then we've seen numerous advances in electronics and battery technologies, and now we have large-scale, low-voltage, low-power CMOS integrated circuitry that consumes minuscule amounts of electricity compared to earlier gear.

The state of the art in battery technology is now based on lithium chemistry. Lithium-ion, lithium-polymer and lithium-iron-phosphate cells are not only far less environmentally harmful but also offer much higher energy densities.

But where to from here?

While there has been much research in the area of supercapacitors which use nano-technology to store electrical energy as a surplus and deficit of electrons rather than through a chemical reaction, researchers confirm that this is still technology in its infancy.

Fuel cells have been promised for decades but, despite regular announcements, even from companies as highly respected as Toshiba, we still don't have those promised Vodka-powered laptops yet. Despite working well in the lab, there are clearly a raft of issues associated with large-scale manufacture of such fuel cells so don't hold your breath.

One idea that has been mooted in some circles recently is the nuclear battery.

Yes, that's right -- a small capsule of radioactive material that can generate electricity with no moving parts.

While this might sound like a giant step backwards and a potential source of hideous environmental danger, that might not be the case.

Thanks to the much lower power demands of many modern portable electronic devices, the amount of radioactive material required may be quite small -- little more than was once used regularly to create the luminous dials on clocks and wrist-watches back in the 1950s.

It's also worth noting that more than a few of the unmanned spacecraft we've sent on missions far from earth have had nuclear batteries onboard. They're not a new concept but one that may see a revival if research is able to significantly reduce the amount of nuclear material required.

Or perhaps the nuclear battery will be a solution looking for a problem.

What manufacturer would want to try to sell a battery that lasts for decades rather than weeks? Unless you charged a fortune, you'd soon saturate the market and sales would dry up.

And then there's the issue of fashion and the relentless advance of electronics technology.

Take the humble cellphone to see what I mean...

These days, people decommission or throw away perfectly good, fully functional mobile phones at regular intervals -- simply because a better, smaller, more powerful replacement comes on the market. What is the point in supplying a battery that lasts 10 years without a recharge when the item it is designed to power might have an effective life of just a year or so?

Nuclear batteries will re-appear but it's unlikely we'll see them in consumer electronics -- they just don't make commercial sense.

Likewise, you won't see cars powered by nuclear batteries -- that really would require too much radioactive material and definitely would pose an environmental/safety problem.

However, I would not dismiss the chance that nuke-powered watches and clocks may appear in the not too distant future -- no backlight required :-)

And let's be honest, wouldn't it be marvelous if you never had to worry about your cellphone or laptop battery going flat again?