Friday, December 18, 2009

Is the man/machine interface stalled?

The first man-machine interface between computers and their operators was a row of flashing lights and an array of switches.

It was simple and, for the time, relatively effective.

However, we've come a long way since those early days and their crude interfaces.

Today we've replaced the flashing lights with cool flat-screen LCD displays and the switches have been usurped by the ubiquitous QWERTY keyboard and mouse.

So where to from here?

What's the next big step in making the interface between computers and their users even better?

Well for a long time it appeared as if voice recognition would be a big and important feature for future computers. That belief seems to have fallen by the wayside however, and even though such systems are now available, they're seldom used for anything other than directed telephone menus.

Even the long-awaited touch-screen seems to be a solution looking for a problem these days.

The technology is certainly available and at a very reasonable price -- but it simply hasn't been embraced by either manufacturers or users.

Why is that?

Well a recent survey of mobile phone users found that while touch screens are nice in theory, a surprisingly large number of people currently using the technology would prefer to go back to good old-fashioned keys.

The situation gets even worse on the desktop. Just try pointing at your screen for any length of time and you'll realise how heavy your arms get. Converting the upright screen into a tablet that lies flat on the desk won't do much to help either. Then it's taking up valuable real estate, and the postural issues that cause neck fatigue raise their ugly head.

More exciting, and even less practical, avenues for man/machine interaction are being touted almost every day.

There are those who believe that contact lenses with built-in LEDs will provide "heads-up" display capabilities, allowing computer users to see the output of their machines without even needing a separate screen.

Successful tests have also been carried out with direct neural stimulation, which allows the computer to bypass your eyes entirely and deliver its output straight into the visual cortex of your brain.

In the other direction, great advances are being made in the area of using thought to control computers. Now while this might be good news for those with physical disabilities, it's not likely to be useful at the office or while gaming at home on your PC.

No, it would appear very much as if we're stuck with our flat-panel screens, keyboards and mice for quite some time to come.

And you know, unless you welcome the prospect of plugging your PC directly into your brain, that's not such a bad thing.

Friday, December 11, 2009

Florist's Google Hackfest

An interesting story surfaced this week in which it was shown just how vulnerable many NZ businesses may be to a form of commercial sabotage performed with the aid of the internet.

According to reports, a Napier florist was able to alter the business details of her competitors as they appeared on the Google Maps service.

The ease with which this sabotage was performed actually beggars belief and represents a sage warning for all those who (whether they know it or not) have a presence on this increasingly useful Google service.

By simply creating a new Google user account, the allegedly fraudulent florist was able to claim ownership of her competitors' details and alter them -- thus derailing attempts by web-surfers to contact them.

Google has warned anyone and everyone whose business appears on Google Maps to make sure they claim their listing and check it for accuracy. Once this is done, only those with the correct account and authentication can update the details of a specific business.

Just for fun, I scoped out many of the businesses in the town where I live and found that although only a small percentage of them had an entry, far too many of those entries were still available to be edited by a third party.

And when you check your listing, be very careful to look for subtle changes -- such as a single incorrect digit in your phone number or PO box. The canny saboteur will ensure that their changes are small enough to be overlooked by people in a hurry.

If your business isn't already listed in Google's Local Business Center then you're missing out on some valuable free publicity and promotion, so you might want to spend a few minutes making sure you can be found by users of Google Maps.

It always amazes me how many people spend a fortune on advertising yet overlook the massive amount of free promotion they can get online simply by asking.

Yes, it's never been easier to promote yourself to an eager market for free -- but it's also never been easier for a competitor to ankle-tap your hard work if you're not vigilant and aware.

If you can't trust a florist, who can you trust?

Friday, December 4, 2009

Core blimey!

If you've been upgrading your PCs regularly, chances are that the machine you're using now is either dual or quad core.

This means there are two or four processors built on the same piece of silicon and, due to their very close proximity and integration, it's possible for suitable software to divide up complex tasks so that parts of the code run concurrently on each core.

When the practical speed limits of switching and bandwidth are reached in any generation of computer technology, the only way to improve throughput is to simply "divide and conquer" by sharing the code between these multiple cores.

A suitable real-world analogy: to increase the flow of traffic on a roadway between two cities you can either raise the speed at which vehicles travel (but only up to a point) or simply add more lanes to the roadway. CPU gigahertz are the equivalent of the speed limit; cores are the equivalent of extra lanes.
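To make that "divide and conquer" idea concrete, here's a minimal sketch in Python (the work being done is purely illustrative) of how a batch of independent jobs can be farmed out across however many cores your machine happens to have:

```python
from multiprocessing import Pool, cpu_count

def crunch(n):
    """A stand-in for some CPU-hungry job -- here, summing squares."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [10_000_000] * 8          # eight independent pieces of work

    # One worker process per core: the jobs run concurrently,
    # each on its own core, rather than queueing up on a single one.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(crunch, jobs)

    print(f"{cpu_count()} cores, total = {sum(results)}")
```

On a quad-core machine those eight jobs run roughly four at a time; double the cores and, for this kind of workload, you roughly halve the wait.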

So you'd think that after quad core there'd be eight-core and then possibly even sixteen-core processors turning up next.

Well, as seems so often to be the case in the computer world, Intel hasn't bothered just doubling the core-count, they've gone straight to a 48-core processor they're calling the Single-chip Cloud Computer (SCC).

In order to create all these cores on a single chip, over 1.3 billion individual transistors are etched onto the surface of the silicon from which the CPU is formed.

Back in the 1960s, a single transistor cost around US$6. Even if a computer of this scale could have been created from individual transistors back then, the price tag would have been around $8 billion. Although Intel hasn't announced a price for the SCC, which is due to be released late in 2010, the odds are that it will be well under US$1,000 even in small quantities.
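For those who like to check the numbers, that back-of-the-envelope figure falls out of a one-liner (assuming roughly 1.3 billion transistors at US$6 apiece):

```python
transistors = 1_300_000_000   # roughly 1.3 billion transistors on the SCC die
cost_each = 6                 # approximate 1960s price per transistor, in US$

print(f"US${transistors * cost_each / 1e9:.1f} billion")   # -> US$7.8 billion
```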

However, Intel aren't resting on their laurels and claim that they'll soon be able to produce a chip with 100 cores. The current "core-count" record, though, must go to the graphics chip manufacturer Nvidia, who have announced their intention to build a 512-core chip.

The real question is -- what are users going to do with all this grunt, and where will the software come from to take advantage of the enormous level of parallel computing power these new devices will represent?

Designing software for a dual or quad core processor is simple enough -- given that many applications can easily benefit from being split into several concurrently executing tasks. Harnessing the power of 48 cores, however, is a whole lot more complex.

The reality is that in a typical desktop situation, most of the 48 cores would simply be idling -- waiting around for something to do.

The real application for one of these uber-powerful processors will be in areas such as image/video processing, cryptography and virtual modelling. Nvidia's 512-core chip, for example, will clearly be designed to provide lightning-fast graphics processing for its range of display cards.

However, when it comes to general-purpose processors such as the one in your desktop PC, the cores of the SCC will only be kept running at full speed where large chunks of data can be broken up into smaller segments that can all be processed at the same time.
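Here's a minimal sketch of that kind of data-parallel workload (the data and the per-segment "processing" are just placeholders): one big block of data is carved into a segment per core, every segment is processed at the same time, and the results are stitched back together.

```python
from multiprocessing import Pool, cpu_count

def process_segment(segment):
    """Stand-in for the real per-segment work, e.g. filtering a slice of an image."""
    return [value * 2 for value in segment]

if __name__ == "__main__":
    data = list(range(1_000_000))               # one large chunk of data
    cores = cpu_count()
    size = len(data) // cores

    # Carve the data into one segment per core (any remainder goes on the last one).
    segments = [data[i * size:(i + 1) * size] for i in range(cores)]
    segments[-1].extend(data[cores * size:])

    # Process every segment concurrently, then stitch the results back together.
    with Pool(processes=cores) as pool:
        processed = pool.map(process_segment, segments)

    result = [item for segment in processed for item in segment]
    print(f"processed {len(result)} items across {cores} cores")
```

Most everyday office work simply doesn't carve up this neatly, which is why so many of those cores end up twiddling their thumbs.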

For this reason, your average Windows-based office PC will probably stick with just a handful of cores for the foreseeable future. Both Intel and AMD have announced more modest 6-core processors for next year's desktop computers.

Nevertheless, Microsoft has stated that they're already planning to produce code that utilises the extra processing power that the SCC offers.

Until such time as newer designs (based on spintronics, optical or quantum technologies) make it out of the laboratory and onto the shelves, adding more cores is the easiest way to get more grunt from your boxes, so don't be surprised if we see "core-count" being touted as a major selling feature, even if you don't need it.