Face it. Interfacing with your computer hasn't improved much in the last 20 years.

We're still tied to the keyboard, mouse, and monitor metaphor. Sure, keyboards have media keys, gaming keys, light sensors and coffee-maker attachments; mice have laser tracking systems the Dept of Defense would be proud of; and monitors are improving in resolution and latency to the point where the next breakthrough will probably be predictive pixel colouring...

But the GUI metaphor is largely stagnant and near-identical across most of the major platforms. 3D spinning windows are a cool way to waste CPU cycles, but will they really help people interface with their computer?

Hope is at hand thanks to a number of innovative projects like BumpTop, but the answer needs to be more than just a software solution. It has to be something that addresses the logical and physical interaction with the system in a way that transcends the current object/pointer interaction.

[Image: Example of the Multi-Touch Interaction desktop]

Jeff Han, a research scientist at New York University, has started to address the problem in a practical, real-world way. The first, quite literally, tangible fruits of the Multi-Touch Interaction Research project have to be seen to be believed...