Onfido is taking deliberate steps to eliminate bias in its facial recognition algorithm. The company is best known for its remote onboarding solution, which compares a selfie to the image on an official government ID to verify the identities of new customers.
As it stands, Onfido has an overall false acceptance rate of 0.01 percent. In plain terms, that means there is only a 1 in 10,000 chance that the system will wrongly accept a selfie as a match for someone else's ID photo.
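As a quick sanity check on that conversion (plain arithmetic, not anything drawn from Onfido's systems):

```python
far = 0.01 / 100   # 0.01 percent expressed as a proportion
print(1 / far)     # 10000.0, i.e. one false acceptance per 10,000 impostor attempts
```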
However, the figure varies across demographic groups. Since Onfido’s customer base is primarily located in Europe, the algorithm was originally trained on a dataset that contained a disproportionate number of images of individuals with lighter skin. As a result, the algorithm has a 0.019 percent false acceptance rate for European nationalities and a 0.008 percent rate for people in the Americas, but a much higher 0.038 percent rate for those of African descent.
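To illustrate how such per-group figures are typically measured, here is a minimal sketch. The function names, data layout, and sample records are all hypothetical and for illustration only; this is not Onfido's code. The idea is simply to compute the false acceptance rate separately for each demographic group from labeled verification attempts.

```python
from collections import defaultdict

def false_acceptance_rate(attempts):
    """FAR = impostor attempts wrongly accepted / total impostor attempts."""
    impostors = [a for a in attempts if a["is_impostor"]]
    if not impostors:
        return 0.0
    return sum(a["accepted"] for a in impostors) / len(impostors)

def far_by_group(attempts):
    """Compute the FAR separately for each demographic group."""
    grouped = defaultdict(list)
    for a in attempts:
        grouped[a["group"]].append(a)
    return {group: false_acceptance_rate(rows) for group, rows in grouped.items()}

# Hypothetical labeled attempts: is_impostor=True means the selfie and the
# ID photo belong to different people, so accepting that pair is an error.
attempts = [
    {"group": "A", "is_impostor": True, "accepted": False},
    {"group": "A", "is_impostor": True, "accepted": True},
    {"group": "B", "is_impostor": True, "accepted": False},
    {"group": "B", "is_impostor": False, "accepted": True},
]
print(far_by_group(attempts))  # {'A': 0.5, 'B': 0.0}
```

Comparing the resulting per-group rates against the overall rate is what exposes the kind of disparity Onfido reported.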
Onfido is now trying to correct that oversight, and is already retraining its algorithm to perform better for people with darker skin. The company indicated that the 0.038 percent rate represents a 60-fold improvement in that error rate over the past year, and that it is still working with the UK’s Information Commissioner’s Office to drive the number down further.
While Onfido is not the only facial recognition provider that has struggled with racial bias, it is one of the few that have publicly acknowledged the problem and taken proactive steps to address it. Major corporations like IBM and Microsoft have also worked to reduce bias, and while they have not been as forthcoming with their numbers, they have agreed to stop selling facial recognition technology to law enforcement agencies.
A recent NIST report found evidence of racial bias in many of the world’s leading facial recognition algorithms. That bias can have devastating consequences for people who are falsely identified, and underscores the need for balanced datasets in AI development.
Source: Axios
(Originally posted on FindBiometrics)