

Rekognition: Amazon Dismisses Claims Of Racial, Gender Bias


Amazon has described as “misleading” a study that labels its facial-recognition tool, Rekognition, as racially and gender biased.

The study was published by the Massachusetts Institute of Technology.

The researchers compared tools from five companies, including Microsoft and IBM.

While none of the tools was 100% accurate, the study found that Amazon’s Rekognition performed the worst when it came to recognizing women with darker skin.

The research found that Rekognition had an error rate of 31% when identifying the gender of dark-skinned women in images.

This compared with a 22.5% rate from Kairos, which offers a rival commercial product, and a 17% rate from IBM.

By contrast, Amazon, Microsoft and Kairos all correctly identified images of light-skinned men 100% of the time.

The tools work by returning a probability score indicating how confident they are in each prediction, rather than a definitive answer.
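To illustrate what such a confidence score looks like in practice, here is a minimal sketch that queries Rekognition through Amazon’s boto3 SDK for Python. The file name is a placeholder, and the call assumes valid AWS credentials are already configured.

```python
# Minimal sketch: asking Amazon Rekognition for a gender prediction
# and its confidence score. Assumes boto3 is installed and AWS
# credentials are configured; "face.jpg" is a placeholder file name.
import boto3

client = boto3.client("rekognition")

with open("face.jpg", "rb") as image_file:
    response = client.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request gender, age range and other attributes
    )

for face in response["FaceDetails"]:
    gender = face["Gender"]
    # Rekognition returns a label plus a 0-100 confidence score,
    # not a guaranteed fact about the person in the image.
    print(f"Predicted gender: {gender['Value']} "
          f"(confidence: {gender['Confidence']:.1f}%)")
```

It is this confidence figure, and how often the accompanying label is wrong for different demographic groups, that the MIT researchers measured.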

Facial-recognition tools are trained on huge datasets of hundreds of thousands of images.

But there is concern that many of these datasets are not sufficiently diverse to enable the algorithms to learn to correctly identify non-white faces.



George Oshogwe Ogbolu is a digital media strategist, content writer, journalist, new media influencer, and proofreader and editor at Naija News.