Keep up to date with the latest Eye Tracking news and trends

Newly Released Apple Patent Uses Head and Eye Tracking To Adjust UI

A newly released patent application from Apple shows the company may be looking to use facial recognition, eye tracking, and head tracking, in addition to its existing sensors (gyroscopes, accelerometers, etc.), to adjust the way users see and interact with the user interface on their devices.

What exactly would this let Apple do for the next generation of the “i” devices? Essentially, this patent would enable Apple to include a user interface that is (or at least appears to be) entirely 3D. By tracking the position of the user’s face and eyes using a front-facing camera, in conjunction with the other data from the device’s onboard sensors, your next iPhone could have the ability to show all of your icons as 3D objects that reorient themselves based on where you are in relation to the phone. Beyond the home screen, this technology could also be applied inside applications, such as iBooks, so you no longer have to struggle to find the perfect angle to hold your iPad when you try to read in bed. Pretty cool, huh?
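To make the idea concrete, here is a minimal sketch of the kind of math such a parallax UI could use. This is an illustrative assumption, not Apple's actual implementation: it converts a face position reported by a front-facing camera (normalized image coordinates) into viewing angles, then into a parallax shift for a UI layer at a given depth. The function names, screen dimensions, and default viewing distance are all hypothetical.

```python
import math

def viewer_angles(face_x, face_y, distance_cm=30.0,
                  screen_w_cm=6.0, screen_h_cm=10.0):
    """Convert a normalized face position (0..1, where 0.5 means the face
    is centered over the screen) into horizontal and vertical viewing
    angles in degrees, assuming a fixed viewing distance."""
    # Offset of the viewer from the screen's center line, in centimeters.
    dx = (face_x - 0.5) * screen_w_cm
    dy = (face_y - 0.5) * screen_h_cm
    yaw = math.degrees(math.atan2(dx, distance_cm))
    pitch = math.degrees(math.atan2(dy, distance_cm))
    return yaw, pitch

def parallax_shift(yaw, pitch, depth=1.0):
    """Parallax offset (in arbitrary UI units) for a layer at the given
    depth: deeper layers shift more, which is what creates the 3D feel
    as the viewer's head moves."""
    shift_x = depth * math.tan(math.radians(yaw))
    shift_y = depth * math.tan(math.radians(pitch))
    return shift_x, shift_y

# A centered face produces no shift; a face off to one side shifts the layer.
yaw, pitch = viewer_angles(0.5, 0.5)
print(parallax_shift(yaw, pitch))  # (0.0, 0.0)
```

In a real system the face position would come from the camera's face detector and would be fused with gyroscope and accelerometer data, as the patent application describes, rather than assumed from a single frame.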

One of the other interesting things in this patent application is the indication that external lighting conditions will also be taken into consideration during the head and eye tracking, something remote eye trackers have struggled with for years.

So what do you think, is this something you would like to see? Do you think it will be widely adopted by users or will they just dismiss it as a gimmicky feature? Let us know in the comments section below!

  • Ihor Petelycky

We already do something similar for viewing S3D content on laptop computers. The eye tracker is used to track the viewer's position in both y and z. For z, we track the interocular distance. When the viewer moves outside the optimal viewing area, the system displays the same view to both eyes, greatly reducing the probability of cross-talk, which manifests itself as a ghosting effect.

  • Claudiamedic

    Amazing! I have disabilities and what is a gimmick to others is a life-changer for me!

  • stephen white

Take the use of the author’s hands and voice away, and I guarantee they’ll change their attitude about the value of eye tracking hard/software for people like claudiamedic and myself, who face a potential for being “locked-in”, only able to move our eyes. Current eye tracking technology is for the most part bulky, clunky, slow, and marked up about 700%. There should be a compact, lightweight, open-source solution to enable us to continue interacting with the world.

  • Odessaguy

    Sounds terrific! I am totally paralyzed from the neck down and use a head mouse with an onscreen keyboard. I am looking forward to a good eye gaze system.