Developed by Google, the Look to Speak app uses the phone’s front-facing camera to “translate” eye movements into selections on the phone’s screen.
Intended to help people with motor disabilities communicate with those around them, Look to Speak lets users select predefined phrases and expressions using only eye movements. The app ships with a set of generic phrases that can be edited and customized for each user.
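For readers curious about this style of interaction, here is a simplified sketch of the general idea: an on-screen list of phrases is repeatedly narrowed with left/right eye gestures until a single phrase remains, which the app would then read aloud. This is not Google’s code; the gesture names, the halving strategy, and the reset-on-look-up behavior are illustrative assumptions.

```kotlin
// Hypothetical sketch, not Google's implementation: it only illustrates how
// detected eye gestures could narrow a phrase list down to one selection.

enum class Gaze { LEFT, RIGHT, UP }  // UP is assumed here to mean "start over"

class PhraseSelector(private val allPhrases: List<String>) {
    private var candidates: List<String> = allPhrases

    /** Feed one detected gesture; returns the chosen phrase once only one remains. */
    fun onGaze(gaze: Gaze): String? {
        val half = (candidates.size + 1) / 2
        candidates = when (gaze) {
            Gaze.LEFT  -> candidates.take(half)  // keep the left-hand group
            Gaze.RIGHT -> candidates.drop(half)  // keep the right-hand group
            Gaze.UP    -> allPhrases             // reset the selection
        }
        return candidates.singleOrNull()         // hand this to text-to-speech
    }
}

fun main() {
    val selector = PhraseSelector(listOf("Yes", "No", "Thank you", "I need help"))
    println(selector.onGaze(Gaze.RIGHT))  // null: two candidates still remain
    println(selector.onGaze(Gaze.LEFT))   // "Thank you"
}
```

In a real app, the `Gaze` events would come from a camera-based eye-tracking pipeline and the final phrase would be spoken through the platform’s text-to-speech engine; the sketch above only models the selection logic.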
On first launch, you need to complete a short setup step to ensure the app works properly: place the phone just below eye level and adjust the eye-tracking sensitivity.
Google provides a guide with instructions and video explanations to help you use the application as effectively as possible.
The Look to Speak interface model could be used for other purposes
According to the development team, the same system could also power more sophisticated interfaces, not necessarily aimed at people with motor impairments. It could serve, for example, as a complementary way of interacting with a car’s on-board MediaNav interface, letting the driver navigate menus and activate certain functions without taking their hands off the wheel or issuing voice commands.
The application is available for free and works on any Android phone running Android 9.0 or later (including Android One devices).