Newly Released Apple Patent Uses Head and Eye Tracking To Adjust UI
A newly released patent application from Apple shows they may be looking to use facial recognition, eye tracking, and head tracking, in addition to their existing sensors (gyroscopes, accelerometers, etc.), to adjust the way users see and interact with the user interface on their devices.
What exactly would this let Apple do for the next generation of the “i” devices? Essentially, this patent would enable Apple to include a user interface that is (or at least appears to be) entirely 3D. By tracking the position of the user’s face and eyes with the front-facing camera, in conjunction with data from the device’s other onboard sensors, your next iPhone could show all of your icons as 3D objects that reorient themselves based on where you are in relation to the phone. Beyond the home screen, this technology could be applied inside of apps, such as iBooks, so you no longer have to struggle to find the perfect angle to hold your iPad when you read in bed. Pretty cool, huh?
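To get a feel for how a 3D effect like this might work, here is a minimal sketch of the core idea: shift each icon opposite to the viewer's head movement, scaled by the icon's virtual depth, so deeper objects appear to sit farther behind the screen. The function name, coordinate convention, and constants here are hypothetical illustrations, not details from the patent.

```python
# Hypothetical sketch: map a tracked head position (normalized front-camera
# coordinates) to a parallax offset for a UI element. Not from the patent.

def parallax_offset(face_x, face_y, icon_depth, strength=30.0):
    """Shift an icon opposite to head movement, scaled by virtual depth.

    face_x, face_y: head position in [-1, 1], with (0, 0) dead center.
    icon_depth:     virtual depth of the icon (0 = on-screen, 1 = far back).
    strength:       maximum shift, in points, at full depth.
    Returns an (dx, dy) offset in points.
    """
    dx = -face_x * icon_depth * strength
    dy = -face_y * icon_depth * strength
    return dx, dy

# A "deep" icon moves noticeably as your head shifts right; a shallow one
# barely moves, which is what sells the illusion of depth.
print(parallax_offset(0.5, 0.0, 1.0))
print(parallax_offset(0.5, 0.0, 0.2))
```

In a real implementation the head position would come from the camera-based face tracker, smoothed and blended with gyroscope and accelerometer data so the effect stays stable when the phone itself moves.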
One of the other interesting things in this patent application is the indication that Apple would also take external lighting conditions into account while doing all of the head and eye tracking, something remote eye trackers have struggled with for years.
So what do you think, is this something you would like to see? Do you think it will be widely adopted by users or will they just dismiss it as a gimmicky feature? Let us know in the comments section below!