New Google Tech Makes Virtual User Interface Possible

Google is breaking new ground with a small radar sensor that can precisely detect small movements such as hand and finger gestures. Called Soli, the technology is being developed by the company’s Advanced Technologies and Projects (ATAP) team.

The Soli device is square-shaped, smaller than a fingernail, and looks like a microchip, but it's actually a radar sensor, using radio waves to produce as many as 10,000 images per second. This lets it capture subtle hand movements to a fine degree. While it hasn't yet been integrated into actual apps or devices, the possibilities are tantalizing, especially given its small size. It could be built into a mobile device, for example, allowing users to control the device via hand gestures without touching the screen, creating in effect a virtual user interface. Similar applications could extend to smart clothing and other wearables.
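To make the idea concrete, here is a minimal sketch of how software might turn a burst of radar frames into a gesture. Everything here is an illustrative assumption, not Google's actual API: the frame format (a list of energy values per range bin), the `centroid` and `detect_swipe` helpers, and the threshold are all hypothetical.

```python
# Hypothetical sketch of a Soli-style gesture pipeline. The frame layout,
# function names, and threshold below are illustrative assumptions only;
# the real sensor's data format and processing are not described in the article.

def centroid(frame):
    """Weighted centroid of signal energy across range bins."""
    total = sum(frame)
    if total == 0:
        return 0.0
    return sum(i * v for i, v in enumerate(frame)) / total

def detect_swipe(frames, threshold=0.5):
    """Classify a burst of frames as a left/right swipe by centroid drift."""
    shift = centroid(frames[-1]) - centroid(frames[0])
    if shift > threshold:
        return "swipe_right"
    if shift < -threshold:
        return "swipe_left"
    return "none"

# Simulated burst: an energy peak drifting across range bins frame by frame,
# as a hand moving past the sensor might produce.
burst = [
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 1.0, 0.0],
]
print(detect_swipe(burst))  # centroid moves from bin 1 to bin 2 -> "swipe_right"
```

At 10,000 frames per second, a real pipeline would smooth over many such bursts; this sketch only shows the basic shape of mapping raw frames to a discrete gesture.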

It’s a major step forward over the gesture-sensing technology currently on the market. And as the device ecosystem of the Internet of Things proliferates, the user interfaces enabled by this kind of technology could prove extremely popular, especially for devices too small or otherwise unsuited for traditional touchscreens and buttons.

Further innovations from the ATAP team include the Jacquard project, which is developing wearable fabrics responsive to touch, and the Ara project, which is investigating a modular smartphone concept; both could also prove to have a major impact on personal computing devices going forward.

Source: CTV News