
Rest in Peace, Mouse… part 2

Yesterday I posted an article that I came across on Wired.com talking about new UI (user interface) trends that are overtaking the conventional mouse as the preferred mode of control. I referred to the list that Wired posted with the article, but wanted to go back and explain what each of these actually is in a bit more detail, just in case anyone was curious.

First, touchscreens. These are pretty self-explanatory, especially now that iPhones have become so ubiquitous. In a sense, touchscreens cut out the middle man, that middle man being the mouse. A user directly controls what they see on the screen by touching and dragging. The problem? There are still a few bugs. Though it sounds crass, fat fingers are a problem. Touchscreens aren't as precise as they could be, but there's a lot of effort going into this field at the moment, so expect more accuracy soon.
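To make the fat-finger problem a bit more concrete, here is a minimal sketch of one common workaround: snapping an imprecise touch to the nearest on-screen target within a small radius rather than demanding a pixel-perfect hit. The target names and tolerance value are made up for illustration, not taken from any particular touchscreen platform.

```python
import math

# Illustrative only: compensate for imprecise "fat finger" touches by
# snapping a touch point to the nearest tap target within a tolerance
# radius, instead of requiring a pixel-perfect hit.

def snap_touch(touch_x, touch_y, targets, tolerance=40):
    """Return the target closest to the touch point, if any lies within
    `tolerance` pixels; otherwise return None (treat it as a missed tap)."""
    best, best_dist = None, tolerance
    for target in targets:
        dist = math.hypot(touch_x - target["x"], touch_y - target["y"])
        if dist <= best_dist:
            best, best_dist = target, dist
    return best

# Example: a slightly off-center touch still activates the "OK" button.
buttons = [{"name": "OK", "x": 100, "y": 200}, {"name": "Cancel", "x": 300, "y": 200}]
print(snap_touch(112, 215, buttons))  # -> the "OK" target
```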

Next is our favorite: eye-tracking. Again, a lot of effort is going into this field with regard to design and usability-testing diagnostics. Eye-Com has its hands in a few uses: communication, medical solutions, and gaming control. There are a few different ways we're seeing eye-tracking being used, but the most prevalent are Tobii's use of timed gaze (where a user stares at an area of the screen for a predetermined amount of time) and Eye-Com's more immediate, blink-enacted control. I mentioned it yesterday, but it's worth noting again that Wired predicts eye-tracking to be the most promising UI technology.
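For the curious, here is a rough sketch of how timed-gaze (dwell) selection works in principle: if the gaze point stays inside a target long enough, treat that as a click. The gaze samples, target layout, and dwell threshold below are all assumptions for illustration, not Tobii's or Eye-Com's actual APIs.

```python
# Hypothetical dwell-time ("timed gaze") selection loop.

DWELL_SECONDS = 0.8  # how long the user must stare before a "click" fires

def inside(box, x, y):
    return box["x"] <= x <= box["x"] + box["w"] and box["y"] <= y <= box["y"] + box["h"]

def run_dwell_loop(gaze_samples, targets):
    """gaze_samples: iterable of (timestamp, x, y) tuples from an eye tracker."""
    dwell_start = {}  # target name -> time the gaze first entered it
    for t, x, y in gaze_samples:
        for name, box in targets.items():
            if inside(box, x, y):
                dwell_start.setdefault(name, t)
                if t - dwell_start[name] >= DWELL_SECONDS:
                    print(f"dwell select: {name}")
                    dwell_start.pop(name)  # reset so it doesn't fire every frame
            else:
                dwell_start.pop(name, None)  # gaze left the target; reset its timer

# Example with synthetic samples: about one second of gaze held on a "play" button.
targets = {"play": {"x": 50, "y": 50, "w": 100, "h": 40}}
samples = [(i * 0.1, 80, 70) for i in range(11)]
run_dwell_loop(samples, targets)
```

Blink-enacted control would swap the dwell timer for a detected blink event as the trigger, which is why it feels more immediate.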

Force feedback. Force feedback refers to what is known as "haptic technology," a UI approach that utilizes a user's sense of touch through the application of motion, vibration, and texture. Again, this is a feature that cuts out the middle man, so to speak (though all of the UIs mentioned could be described as such), by bringing the user directly to the control of the PC. There is much research being done on how touch and underlying brain functions work, so expect to see more of this in the near future as well.
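As a very rough illustration of the idea, the sketch below maps interface events to short vibration patterns. The `vibrate` function is a hypothetical stand-in for whatever actuator API a real device would expose; the event names and timings are invented for the example.

```python
import time

# Hypothetical mapping of UI events to short vibration patterns,
# each pattern a list of (vibrate_seconds, pause_seconds) pulses.
PATTERNS = {
    "button_press": [(0.03, 0.0)],                 # one short tick
    "error":        [(0.10, 0.05), (0.10, 0.0)],   # two longer buzzes
}

def vibrate(duration):
    """Placeholder for a device-specific actuator call."""
    print(f"vibrating for {duration * 1000:.0f} ms")
    time.sleep(duration)

def play_haptic(event):
    for buzz, pause in PATTERNS.get(event, []):
        vibrate(buzz)
        time.sleep(pause)

play_haptic("error")
```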

Well, that's enough detail for one day. Check back tomorrow when I go over the remaining three UIs: accelerometer sensors, voice commands, and gesture recognition.

Related articles:

  1. Rest in Peace, Mouse… part 3
  2. Rest in Peace, Mouse.