Envision AI, the smartphone app that helps visually impaired people “see” the world around them by using artificial intelligence to describe objects, has announced the debut of Envision Glasses, which bring its AI-powered platform to Google Glass.
Wearers will hear the smart glasses speak aloud, identifying and describing what is in their environment as they turn their head — a potentially more convenient form factor than holding up a smartphone’s camera to identify signs, people and objects.
The software can identify words in 60 different languages, and will read aloud everything from printed materials — regardless of the writing surface — to computer screens and human handwriting. What’s more, its computer vision technology can even recognize faces and describe scenes.
These features could combine to help those who are visually impaired interact with family and friends, navigate around their homes, and independently use public transportation.
Users also have the option of having the dictation spoken directly into their ear rather than through the speaker, as the Glasses are compatible with Bluetooth wireless headphones.
Aside from the marriage between Envision AI’s software and Google Glass Enterprise Edition 2, the core hardware of the device is unchanged, with eight-hour battery life, Wi-Fi and Bluetooth wireless capabilities, and USB-C connectivity. The screen is still present, though Envision considers it “redundant for [their] use case,” as users will rely largely on the camera, speaker, touchpad and processor for their needs.
The Envision Glasses are available for preorder today at $1,699 for “super early bird” buyers. The regular retail price will be $2,099, and units are expected to start shipping to the first customers in August 2020.