User Interface Pre-cogs Predict End of the Mouse
We’ve likely all seen it: John Anderton, played by Tom Cruise, snaps on his information man-handling gloves and proceeds to literally manipulate the recorded premonitions (future memories?) served up by the pre-cog (not to be confused with a Happy Cog) trinity, deftly summoning those he desires without having to wait for search results, and swatting away those less useful.
Most of us reacted to it similarly, I imagine: “Whoa…”. Sure, it was impressive. Especially for 2002. But between it and Spielberg’s sideways and vertically driven cars speeding up and down buildings, our suspension of disbelief was pulled taut. To near failing, at least in my case. If not so entertained, it might have snapped. And though its effect was dramatic (according to its purpose), is this a practical human-computer interface design solution, or just an accidental preview of Wii Fit?
“Imagination is more important than knowledge.”
—Albert Einstein, Genius
Authors have traditionally been our greatest “imagineers,” often leading us where no one else has gone before, extending the limits of our collective imagination. Some ideas even catch on. You have Captain Kirk’s communicator in your pocket, after all. And you can watch Star Trek via satellite, tonight, thanks to Arthur C. Clarke’s great powers of cognition. So will you soon be rushing to Fry’s to get Philip K. Dick’s giant transparent monitor and information-aware gloves? Perhaps. People are working on it, I suppose you know.
With all the emphasis on ubiquity, immersion, and gestural interfaces, the attention paid to 3-D interfaces and operating systems (not to mention 3-D browsers), and the arrival of some very non-traditional controllers and touchpads, one just has to ask: is the mouse WIMPing out?
From some SXSW Interactive presentations, you might think the mouse has somehow failed us. Okay, I’m exaggerating. And yes, I know as well as the next die-hard Star Wars fan that innovating for fun and profit is, well, fun and profitable. And, to be honest, this isn’t the path I’d planned for this post. But let’s keep it real here. It’s reality check time.
Mice and windows will, I here predict, remain on a desktop very near you for many years to come. Necessity may no longer be the mother of invention, but a tabletop computer? I do not see the ‘wow’ factor in laying a touch-screen monitor out flat. I’m sorry, Microsoft. Invention is a mother after all. And by the by, Outlook could really use some improvement… still.
“Forecasting 50 years ago was as good or as bad as it is today. And the reason is that human nature hasn’t changed. We can’t improve ourselves.”
—Alan Greenspan, Former Chairman, U.S. Federal Reserve
I really don’t want to rain on the go-go-gadget parade. But many of these human-machine interface design innovations are not being driven by people’s needs or good old-fashioned design thinking; they seem to be driven by economics. Okay, that is one need, sure. But not one that leads to useful, usable, desirable (not to be confused with seductive) and therefore successful products.
The point here is not that 3-D interfaces have long been in the works (it is not for lack of imagination that you’re not using one now). The point is that self-interested innovation will not move design forward any more than releasing new models every year has moved GM forward. In times such as these I hope any innovation supports our economic imperatives. But let’s not lose focus. For human-computer interface and product designs to succeed, they have to help us do something, do it better, or fulfill our needs differently in a particularly useful way. People make things successful, not technology. And we have not changed that much at all.
The UI for the Minority Report computer looked great in a movie, but I think a full day of using one would leave most mortals wrecked, not to mention constantly yelling: “Hey! Can you not move around on the other side of my monitor, please…”
Still, the principle’s there in the iPad.
I find it hard to take an article seriously when the author doesn’t know “deminse” from “demise” (an unusual name vs. a noun). Sigh.
Just a typo, Jim. Like the one in your comment. 😉