An Assistant Professor at Louisiana State University is researching two novel forms of hand-based phone authentication. Chen Wang is a member of the Department of Computer Science, and hopes to commercialize both projects within the next three years.
The first of the two is designed to prevent unauthorized users from viewing sensitive information that might be displayed on a notification screen. When someone receives a notification, the phone usually makes a sound. Wang’s system records that sound, along with any differences in tone that result from the way the person is holding the phone. In that regard, Wang noted that everyone holds their phone differently due to factors like finger length and grip strength. Those factors dampen or refract the sound in unique ways, so the phone’s microphone can learn those patterns and use them to verify the identity of the person holding it, in much the same way that a voice recognition system records someone’s unique voiceprint.
In practice, the phone will display the full notification if it is in the hands of its true owner. Otherwise, it will display only the number of notifications, hiding more sensitive details from onlookers, or from a friend or family member who just happens to be holding it. Wang is developing the technology alongside third-year PhD student Long Huang, who published a paper on the topic with Wang at MobiCom 2021.
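To make the idea concrete, the sketch below shows one way such a check might be wired together: compare the spectrum of a grip-dampened notification chirp against an enrolled acoustic template, then decide how much of the notification to reveal. The feature extraction, similarity metric, and threshold here are illustrative assumptions, not the researchers' actual method.

```python
# Minimal sketch (not Wang's actual pipeline): verify the phone holder by
# comparing the recorded, grip-altered notification sound against an
# enrolled acoustic profile. Features, metric, and threshold are assumptions.
import numpy as np

def spectral_features(audio: np.ndarray, n_bins: int = 128) -> np.ndarray:
    """Reduce the chirp's magnitude spectrum to n_bins bands and unit-normalize."""
    spectrum = np.abs(np.fft.rfft(audio))
    bands = np.array_split(spectrum, n_bins)
    feats = np.array([band.mean() for band in bands])
    return feats / (np.linalg.norm(feats) + 1e-9)

def is_owner(recorded: np.ndarray, enrolled_template: np.ndarray,
             threshold: float = 0.9) -> bool:
    """Cosine similarity between the live recording and the enrolled grip profile."""
    live = spectral_features(recorded)
    return float(np.dot(live, enrolled_template)) >= threshold

def render_notification(recorded, enrolled_template, notifications):
    # Mirrors the behaviour described above: full text for the owner,
    # just a count for anyone else.
    if is_owner(recorded, enrolled_template):
        return "\n".join(notifications)
    return f"{len(notifications)} new notifications"
```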
The other technique is being developed with second-year PhD student Ruxin Wang and computer science master’s graduate Kailyn Maiden, and is similar to palm-based identification systems. The difference is that Chen Wang’s system scans the back of the hand rather than the palm. The system is specifically meant to be used when someone holds up their phone to interact with another device, such as a kiosk or a payment terminal. If that device is equipped with a camera, it can scan the back of the hand that is holding the phone, and analyze characteristics like grip shape and skin color to match that hand to a registered profile and verify the person’s identity. In doing so, it seamlessly adds a second authentication factor that can strengthen QR code- or token-based security systems.
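As a rough illustration of how such a second factor could sit alongside a QR code or token check, the sketch below gates access on both a valid token and a back-of-hand match. The embedding function is a placeholder for whatever image features a real kiosk camera system would extract; none of this is drawn from the researchers' implementation.

```python
# Minimal sketch (illustrative only): back-of-hand match as a second factor
# on top of an existing QR/token check.
import numpy as np

def hand_embedding(image: np.ndarray) -> np.ndarray:
    """Placeholder feature vector from a back-of-hand image (assumption: a
    coarse intensity histogram; a real system would use learned features)."""
    hist, _ = np.histogram(image, bins=64, range=(0, 255))
    vec = hist.astype(float)
    return vec / (np.linalg.norm(vec) + 1e-9)

def second_factor_ok(captured_image: np.ndarray,
                     enrolled_embedding: np.ndarray,
                     threshold: float = 0.85) -> bool:
    """Compare the captured hand against the enrolled profile."""
    return float(np.dot(hand_embedding(captured_image), enrolled_embedding)) >= threshold

def authenticate(token_valid: bool, captured_image, enrolled_embedding) -> bool:
    # Both factors must pass: the token (first factor) and the hand match (second).
    return token_valid and second_factor_ok(captured_image, enrolled_embedding)
```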
Wang, Wang, and Maiden are planning to publish a paper on the new modality at the 2022 ACM CHI Conference on Human Factors in Computing Systems. The researchers on both projects are also taking steps to improve the accuracy of the two systems, and to ensure that they are resistant to spoofing attacks carried out with fake hands or acoustic replay.
Aerendir, meanwhile, already authenticates users based on the way they hold their phone, though the company’s solution relies on proprioceptive biometrics rather than acoustics.
(Originally posted on FindBiometrics)