Deep in the darkest depths of the code for Google Glass's Android companion app, a Reddit user has found numerous lines that hint at some interesting eye gestures for the high-tech specs.
Reddit user Fodawim dug up the section of code in question.
Under the title "EYE_GESTURES_WINK", two lines denote the device's ability to enable and disable eye gestures, while the remaining lines deal with calibrating the feature and with the command to take a photo using Glass's inbuilt camera.
Glass has a proximity sensor on the inner side of the frame, which most assumed was there to detect whether the user is wearing the device. Some of Glass's more enthusiastic early adopters have suggested, however, that the sensor could be calibrated to detect a wink.
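To illustrate the idea, here is a minimal, purely hypothetical sketch (not Google's actual code) of how readings from such a proximity sensor might be scanned for a deliberate wink: a spike above a baseline that lasts longer than an involuntary blink but shorter than simply closing the eyes. All names, thresholds, and window sizes below are illustrative assumptions.

```python
def detect_wink(samples, threshold=0.5, min_len=3, max_len=10):
    """Scan normalized proximity readings (hypothetical values in [0, 1])
    for a single sustained spike consistent with a deliberate wink.

    A run of readings above `threshold` counts as a wink only if it is
    longer than a reflexive blink (`min_len` samples) but shorter than
    the eye being held shut (`max_len` samples)."""
    run = 0
    runs = []
    for s in samples:
        if s > threshold:
            run += 1
        elif run:
            runs.append(run)
            run = 0
    if run:
        runs.append(run)
    return any(min_len <= r <= max_len for r in runs)
```

With a 100 Hz sensor, for example, `min_len=3` and `max_len=10` would correspond to a 30-100 ms eyelid closure; real calibration would tune these per wearer.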
Another hint at the use of eye gestures is a Google patent covering eye tracking:
Methods and systems for unlocking a screen using eye tracking information are described. A computing system may include a display screen. The computing system may be in a locked mode of operation after a period of inactivity by a user. Locked mode of operation may include a locked screen and reduced functionality of the computing system. The user may attempt to unlock the screen. The computing system may generate a display of a moving object on the display screen of the computing system. An eye tracking system may be coupled to the computing system. The eye tracking system may track eye movement of the user. The computing system may determine that a path associated with the eye movement of the user substantially matches a path associated with the moving object on the display and switch to be in an unlocked mode of operation including unlocking the screen.
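The core of the patent's unlock scheme is a path comparison: track the gaze, and unlock only if its path "substantially matches" the path of the moving on-screen object. A minimal sketch of that comparison might look like the following, assuming both paths have already been sampled into equal-length lists of 2D screen coordinates; the function names and the distance threshold are illustrative assumptions, not anything from the patent.

```python
import math


def path_similarity(eye_path, object_path):
    """Mean Euclidean distance between corresponding points of two
    equal-length 2D paths; smaller values mean a closer match."""
    assert len(eye_path) == len(object_path) and eye_path
    total = sum(math.dist(e, o) for e, o in zip(eye_path, object_path))
    return total / len(eye_path)


def should_unlock(eye_path, object_path, threshold=20.0):
    """Unlock when the tracked gaze path substantially matches the
    on-screen object's path, i.e. mean deviation is within `threshold`
    (here an assumed tolerance in screen pixels)."""
    return path_similarity(eye_path, object_path) <= threshold
```

A real implementation would also need to resample the two paths to a common length and handle tracker noise, but the comparison step reduces to something of this shape.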
It is uncertain whether eye tracking will be possible in the current iteration of Glass or will be added in future versions, but it's an interesting and potentially useful concept nonetheless.
Source: Ars Technica | Image via Google