Facial recognition systems far from flawless

Facial recognition systems are not always reliable, especially for people who are not white, according to a U.S. government report published on Friday.

Such systems misidentify Asians and African-Americans up to 100 times more often than whites, according to the study, which analysed dozens of algorithms.

Researchers at the government-affiliated National Institute of Standards and Technology (NIST) also found two algorithms that attributed the wrong gender to black women in 35% of cases.

Facial recognition is already widely used by authorities and law enforcement agencies, at airports and banks, in businesses and in schools. It is also used to unlock some smartphones.

Many human rights advocates and researchers have been trying to block its further deployment. They argue that the algorithms make too many mistakes, that innocent people could be sent to prison, and that databases risk being hacked and exploited by criminals.

Algorithms developed in the United States had higher error rates for Asians, African-Americans and American Indians, the study found, while those designed in Asian countries identified Asian and white faces equally well.

Research director Patrick Grother described this as encouraging, since it showed that using a more diverse database produces better results.

However, the American Civil Liberties Union (ACLU) commented that what the study showed, above all, was that the technology is not ready and should not be deployed.

Even government scientists confirm that this surveillance technology is flawed and biased, commented ACLU analyst Jay Stanley, warning that misidentification can cause people to miss flights or be wrongly placed on watchlists. It can also create tensions with police forces and lead to wrongful arrests, or worse.

But above all, he said, reliable or not, the technology enables undetectable, omnipresent surveillance on an unprecedented scale.

The Brussels Times

