Communications of the ACM

ACM TechNews

Study Outlines What Creates Racial Bias in Facial Recognition Technology

A recent study from two University of Texas at Dallas researchers and their two colleagues outlined the underlying factors that contribute to race-based deficits in facial recognition accuracy.

Credit: UT Dallas News Center

A study by Alice O'Toole and Jacqueline Cavazos of the University of Texas at Dallas (UT Dallas) identified the underlying factors that produce racial bias in facial recognition technology.

O'Toole said different mechanisms contribute to these biases; Cavazos categorized the contributing factors as either data-driven (affecting the algorithm's performance) or operationally defined (stemming from user input).

O'Toole noted that a reduction in training image quality makes racial bias more pronounced, while Cavazos said operational bias can be introduced by where the threshold between matching and nonmatching decisions is set, and by which types of paired images are selected.

Said O'Toole, "We have learned so much about the complexity of the problem that we have to acknowledge that there may never be a solution to the problem of making every face equally challenging to a face recognition algorithm."

From UT Dallas News Center
View Full Article


Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA


No entries found