Bias in Biometric Recognition: Challenges and Opportunities

Oct 18, 2022

Session Details

Bias in biometric recognition is a challenge that has garnered significant attention in recent years. But what is it?

To understand this, it helps to know how biometric technology works. To make its decision, a biometric system compares an image, whether a face, fingerprint, or iris, against a previously enrolled image. Sometimes that decision is wrong. You may have experienced one kind of error, a "false negative", when a smartphone with a fingerprint sensor or face camera failed to recognize you. The remedy is to repeat the attempt or, failing that, to use a back-up mechanism. There is also the opposite error, a "false positive", when a biometric system incorrectly accepts an attacker as a match.
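At its core, this decision reduces to scoring a probe against an enrolled template and comparing that score to a threshold. The sketch below is a minimal illustration of that idea only; the function names, the cosine-similarity score, and the threshold value are assumptions for exposition, not any particular system's implementation.

```python
import numpy as np

def similarity(probe: np.ndarray, enrolled: np.ndarray) -> float:
    """Cosine similarity between two feature embeddings (illustrative score)."""
    return float(np.dot(probe, enrolled) /
                 (np.linalg.norm(probe) * np.linalg.norm(enrolled)))

def decide(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """Accept the identity claim if the score clears the operating threshold.

    A genuine user whose score falls below the threshold is a false negative;
    an attacker whose score exceeds it is a false positive.
    """
    return similarity(probe, enrolled) >= threshold
```

Raising the threshold trades false positives for false negatives, and vice versa; the error rates discussed below are always measured at some chosen operating point like this.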

The question is whether these errors fall disproportionately on specific demographic groups. Often called "bias", the issue arises when one group experiences more errors than another. Studies have shown that some algorithms exhibit, on average, more errors for one demographic group than for another. To distinguish this from societal bias or statistical bias, researchers have called it "differential performance".
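One common way to quantify differential performance is to measure the false-negative rate of genuine comparisons separately for each group at the same threshold. The sketch below is hypothetical: the scores, group labels, and threshold are invented for illustration and do not come from any real evaluation.

```python
from collections import defaultdict

def false_negative_rate_by_group(genuine_scores, groups, threshold=0.6):
    """genuine_scores[i] is a genuine-pair score; groups[i] is its demographic label."""
    misses = defaultdict(int)
    totals = defaultdict(int)
    for score, group in zip(genuine_scores, groups):
        totals[group] += 1
        if score < threshold:  # genuine user rejected -> false negative
            misses[group] += 1
    return {g: misses[g] / totals[g] for g in totals}

# A gap between groups at the same threshold indicates differential
# performance for this (hypothetical) algorithm.
rates = false_negative_rate_by_group(
    genuine_scores=[0.72, 0.55, 0.81, 0.58, 0.64, 0.49],
    groups=["A", "A", "A", "B", "B", "B"],
)
print(rates)  # e.g. {'A': 0.33, 'B': 0.67}
```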

This talk will give an overview of the issue and address the following questions: Why does this occur for some biometric algorithms and not others? How can it be mitigated? And how can proposed solutions be evaluated?