
Communications of the ACM

ACM TechNews

Algorithms Can Be More Fair Than Humans


How fast can it get there?

Amazon's recent rollout of same-day delivery in selected metropolitan areas demonstrated how computerized decision-making can also deliver a strong dose of discrimination.

Credit: Hadrian/Shutterstock.com

Although data-driven decisions can be biased and unfair, the same methodology may make such biases easier to identify, writes University of Michigan professor HV Jagadish, citing an investigation by Bloomberg News.

Bloomberg found Amazon's same-day delivery algorithms contained hidden biases that excluded poor urban areas from the retailer's service area. Jagadish contends this tendency among popular online retailers is of legitimate concern because of their rigidity and dominance: rigidity, because the business makes a fixed decision about which ZIP codes fall within its service area, and dominance, because of the retailers' precedence in consumers' minds.
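To make the idea of rigidity concrete, here is a minimal, hypothetical Python sketch; the ZIP codes, the ELIGIBLE_ZIPS set, the is_eligible function, and the demographic labels are all invented for illustration and are not Amazon's actual logic. The point is that a fixed service-area rule applies the same exclusion to every customer, which is precisely what makes the resulting disparity measurable:

# Hypothetical illustration -- not Amazon's actual service-area logic.
# A rigid rule: eligibility depends only on a fixed whitelist of ZIP codes.
ELIGIBLE_ZIPS = {"02139", "02141", "02142"}  # assumed whitelist

def is_eligible(zip_code: str) -> bool:
    """Return True if same-day delivery is offered for this ZIP code."""
    return zip_code in ELIGIBLE_ZIPS

# Because the rule is fixed, an auditor can apply it to every ZIP code in a
# city and cross-reference the results with census demographics.
neighborhoods = [
    {"zip": "02139", "low_income": False},  # assumed demographic labels
    {"zip": "02121", "low_income": True},
]
coverage = {n["zip"]: is_eligible(n["zip"]) for n in neighborhoods}
print(coverage)  # {'02139': True, '02121': False}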

"While their rigidity and dominance may cause us greater concern about online businesses, we also are better able to detect their discrimination than we are for brick-and-mortar shops," Jagadish says.

He also cites a recent ProPublica analysis of an algorithm that predicts a criminal defendant's probability of recidivism; the analysis uncovered systematic racial bias even though race was excluded as an explicit factor. In contrast, "any errors that result from bias in human judges' decisions are likely to be different among judges, and even for different decisions made by the same judge," Jagadish notes.
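ProPublica's finding was, at bottom, a statistical audit of the algorithm's predictions against later outcomes. A rough, hypothetical sketch of one such check (the records list and its field names are invented for illustration) compares false positive rates, the share of people who did not reoffend but were nonetheless flagged high risk, across groups:

from collections import defaultdict

# Hypothetical audit records: the group label, the algorithm's prediction,
# and whether the person actually reoffended during the follow-up period.
records = [
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": True,  "reoffended": False},
    {"group": "B", "flagged_high_risk": True,  "reoffended": True},
    # ... thousands more rows in a real audit
]

def false_positive_rate(rows):
    """Share of people who did NOT reoffend but were still flagged high risk."""
    did_not_reoffend = [r for r in rows if not r["reoffended"]]
    if not did_not_reoffend:
        return float("nan")
    flagged = sum(r["flagged_high_risk"] for r in did_not_reoffend)
    return flagged / len(did_not_reoffend)

by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r)

for group, rows in sorted(by_group.items()):
    print(group, false_positive_rate(rows))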

He says that, unlike with humans, it is easier to unearth conclusive proof that an algorithm discriminates, thanks to the algorithm's rigidity and the large volume of factors it weighs.
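One way to see why rigidity helps is a counterfactual probe: because a deterministic algorithm answers the same query the same way every time, an auditor can hold every input fixed and vary a single attribute. The sketch below is a toy under assumed names; score_applicant stands in for whatever decision function is being audited:

# Toy counterfactual audit: hold all inputs fixed, vary one attribute,
# and record whether the (deterministic) decision changes.
def score_applicant(applicant: dict) -> bool:
    """Stand-in for the decision function under audit."""
    # Hypothetical rule that never looks at race directly but keys on ZIP
    # code, which can act as a proxy for it.
    return applicant["zip"] in {"02139", "02141"}

base = {"zip": "02139", "income": 40_000, "age": 30}
for zip_code in ["02139", "02121", "60644"]:
    probe = dict(base, zip=zip_code)
    print(zip_code, score_applicant(probe))

# A human judge cannot be re-run on thousands of near-identical cases;
# a rigid algorithm can, which is what makes systematic disparities provable.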

From The Conversation
View Full Article

 

Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA


 
