Wednesday, May 21, 2008

Editorial: The Future of the Interface

Currently I am working on a report that summarizes the new directions interfaces will take in the future. I wanted to write a quick update on where I am so far and what interesting information has come up. By the way, I love bullet points, but I'll get to that in another post.
  • Interfaces as we know them will be re-imagined
  • Pioneers will be young engineers with crazy ideas
  • Interfaces will interact with the user
Interfaces as we know them will be re-imagined
One thing you can always count on is new, innovative technologies replacing old ones. Innovation has changed the landscape of almost every industry in the world. It often changes the players in an industry as well, because new players simply think differently, while the old-timers are generally stuck in one way of thinking.

Pioneers will be young engineers with crazy ideas
It happened with Microsoft, HP, Google, and almost every other "eBusiness" or tech company. But the fascinating thing is that the interfaces that get picked up will be different from what we expect. We think of using a mouse, so a natural progression is to use a "finger" as the mouse, as MultiTouch does. Of course, MultiTouch is a great interface for many applications, but is that all that is new? Engineers will be inventing things we haven't even thought of. I could list gesture interfaces, hologram interfaces, and all kinds of crazy things, but of course, if an engineer is inventing something I haven't thought of, it is very difficult to mention it here.

Interfaces will interact with the user
Currently, most interfaces are static. How have they interacted with the user in the past? Keyboards do offer tactile feedback, and the best example is the ForceFeedback found on joysticks. But do interfaces actually interact with the user? Not really. Interfaces in the future will evolve with use. They will learn how you use the computer. They will help determine the optimal placement of each menu item, each key, and each button. They will determine which gestures work best for you.
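To make that last idea concrete, here is a minimal sketch in Python of what "learning from use" could look like at its simplest: a toy menu that reorders its items by how often you pick them. All the names here (AdaptiveMenu and so on) are made up for illustration; a real adaptive interface would be far more sophisticated than counting clicks.

    from collections import Counter

    class AdaptiveMenu:
        """A toy menu that reorders its items by how often they are used."""

        def __init__(self, items):
            self.items = list(items)
            self.usage = Counter()

        def select(self, item):
            # Record each selection so the menu can learn your habits.
            self.usage[item] += 1

        def layout(self):
            # Most-used items float to the top; ties keep the original order.
            return sorted(self.items, key=lambda item: -self.usage[item])

    menu = AdaptiveMenu(["Open", "Save", "Print", "Export"])
    for _ in range(5):
        menu.select("Save")
    menu.select("Print")
    print(menu.layout())  # ['Save', 'Print', 'Open', 'Export']

The principle is the same no matter how smart the system gets: the interface watches what you do, then adapts its layout to match.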

The last point, that interfaces will interact with the user, is the most interesting to me. I imagine a world where a computer analyzes how you input commands and then optimizes itself to make it easier for you to use. Wouldn't that be amazing?