About
Facial recognition (FR) is everywhere in modern technology, whether we are aware of it or not. It has been used to solve crimes, to track down missing children, and more, making it an efficient tool that we rely on for our population's well-being. As FR has grown in popularity, so has the bias present in the technology, creating a dire need for a resolution: as long as this bias persists, FR technology will never be fully accurate and will always have flaws in its identification process. Two major issues must be tackled to resolve it. One is the lack of publicly available datasets that are unbiased. The other is that every FR system is unique, so a single, standard form of evaluation is needed to keep flaws from varying from software to software.
Our findings reveal a bias in scoring sensitivity across subgroups when verifying the identity of a subject from facial images. In other words, the performance of an FR system on different subgroups (e.g., male vs. female, Asian vs. Black) typically depends on a single global threshold (i.e., the decision boundary on scores or distances that determines whether a pair is genuine or impostor). Using fundamental signal detection theory, we show that a single global threshold skews performance ratings across subgroups, and we demonstrate that subgroup-specific thresholds are optimal in terms of both overall performance and balance across subgroups.
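To make the thresholding idea concrete, below is a minimal sketch of calibrating a separate threshold per subgroup so that every subgroup operates at the same target false-match rate. It assumes NumPy only; the function and variable names are illustrative and not taken from the BFW codebase.

```python
import numpy as np

def threshold_at_fpr(scores, labels, target_fpr=1e-3):
    """Smallest threshold whose false-match rate on impostor pairs
    (label == 0) does not exceed target_fpr."""
    impostor = np.sort(scores[labels == 0])
    # Quantile index that leaves at most target_fpr of the impostor
    # scores above the threshold.
    k = int(np.ceil((1.0 - target_fpr) * len(impostor))) - 1
    return impostor[max(k, 0)]

def subgroup_thresholds(scores, labels, subgroups, target_fpr=1e-3):
    """Map each subgroup tag (e.g., 'asian_female') to its own threshold."""
    return {g: threshold_at_fpr(scores[subgroups == g],
                                labels[subgroups == g], target_fpr)
            for g in np.unique(subgroups)}

# Toy usage: impostor scores for subgroup 'B' sit higher than for 'A',
# so a single global threshold would inflate B's false-match rate.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.30, 0.1, 5000),   # impostor pairs, A
                         rng.normal(0.45, 0.1, 5000)])  # impostor pairs, B
labels = np.zeros(10_000, dtype=int)                    # all impostor pairs
subgroups = np.array(['A'] * 5000 + ['B'] * 5000)
print(subgroup_thresholds(scores, labels, subgroups))
```

Calibrating per subgroup, as in this sketch, yields a higher threshold for the subgroup whose impostor scores run hotter, which is what keeps false-match rates balanced across subgroups.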
Furthermore, we built and released the facial image dataset needed to address bias from this view of FR: Balanced Faces in the Wild (BFW).
- A unique, publicly available dataset that can be used for training.
- A dashboard that researchers can use to find bias in their FR models.
This work aims to:
- Reveal the bias present in FR technology.
- Help eliminate said bias by creating a dashboard that can evaluate it.
- Allow for public use of the Balanced Faces in the Wild (BFW) database, which equally represents each ethnicity and gender.