Facial recognition: the Defender of Rights fears “an unparalleled potential for discrimination”

In a report, the independent administrative authority highlights the risks associated with biometric technologies. Like Europe's data protection authorities, it opposes the deployment of cameras designed to recognize individuals in public spaces.

A new voice has joined the chorus of critics of facial recognition. On Tuesday, July 20, the Defender of Rights published a report on biometrics in which the public institution calls for precautions around these controversial systems, beyond the issue of privacy protection alone.

While the French government had hoped to test cameras for recognizing individuals in public spaces before abandoning the project, the independent administrative authority responsible for protecting individual rights warns of the “unparalleled potential for amplification and automation of discrimination” of these innovations. The technology is also under discussion in Brussels, where the European Commission wishes to authorize it only by way of exception.

Risks even at 100% reliability

A year ago, the public institution, together with the CNIL, had already examined the problem of errors and algorithmic biases that give rise to discrimination. “In the case of facial recognition, the consequences of these errors can range from being refused access to a place to an unjustified police arrest,” points out Claire Hédon, the Defender of Rights. Deployed in public spaces, these technologies could make hundreds of mistakes, the report notes, as examples in the UK have shown. “Between the communication and the reality, there is a chasm,” recalls the specialist Laurent Mucchielli.

But the document goes further by highlighting the risks inherent in facial recognition, however reliable it may be. Even with a 100% accuracy rate in identifying a person or, as is beginning to be attempted, the emotions on a face, the Defender of Rights fears that decisions subsequently taken by a human or by another algorithm would only reinforce discrimination.

The institution particularly fears the deployment of these systems in geographic areas where populations are already subject to far more identity checks than elsewhere, notably where “young men perceived as belonging to minorities” are over-represented.

Waiver of rights

In addition, the deployment of these cameras in public spaces could, according to Claire Hédon, lead citizens to renounce certain rights, in particular the right to demonstrate, if they know they can be recognized. Some particularly vulnerable populations, for example undocumented immigrants, could also forgo their right to healthcare if a camera scans them as they approach a hospital.

Drawing on the General Data Protection Regulation, the Defender of Rights would like to require the designers and purchasers of these technologies to conduct an impact assessment that includes questions of discrimination. “The deployment of these technologies, because they can infringe on citizens' rights, cannot take place without strict guarantees, in particular of necessity and proportionality,” Claire Hédon insists.

On trial for imperfection

“With regard to the most intrusive uses, such as real-time remote biometric identification devices in public places, it appears difficult to conceive how the use of these systems could be considered necessary and proportionate today, given the significant risks of misuse that they present,” the report continues, echoing Europe's data protection authorities.

This conclusion will undoubtedly displease companies in the biometrics sector, in particular world-class French champions such as Idemia and Thales. “The technology is often put on trial for imperfection, even when we identify a person from a database of faces in 99.7% of cases [in the laboratory, Editor's note],” counters Vincent Bouatou, strategic innovation manager for the public sector at Idemia. He calls, on the contrary, for authorizing the deployment of facial recognition cameras in public spaces so that the technology can improve further.
