Rekognition: Amazon Dismisses Claims Of Racial, Gender Bias
Amazon has described as “misleading” a study that labelled its facial-recognition tool, Rekognition, as racially and gender biased.

The study was published by the Massachusetts Institute of Technology.

The researchers compared tools from five companies, including Microsoft and IBM.

While none of the tools was 100% accurate, the study found that Amazon’s Rekognition performed the worst when it came to recognizing women with darker skin.

The research found that Amazon had an error rate of 31% when identifying the gender of images of women with dark skin.

This compared with a 22.5% rate from Kairos, which offers a rival commercial product, and a 17% rate from IBM.

By contrast, Amazon, Microsoft and Kairos all successfully identified images of light-skinned men 100% of the time.

The tools work by returning a probability score indicating how confident they are in each identification.

Facial-recognition tools are trained on huge datasets of hundreds of thousands of images.

But there is concern that many of these datasets are not sufficiently diverse to enable the algorithms to learn to correctly identify non-white faces.
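To illustrate the figures above: an error rate like the 31% reported for Rekognition is simply the share of test images whose predicted label disagrees with the ground truth, regardless of the confidence score attached. This is a minimal sketch with made-up data, not Amazon's API or the MIT study's code:

```python
# Illustrative sketch only: each tool returns a predicted label plus a
# confidence score; the error rate counts how often the label is wrong.

def error_rate(predictions, ground_truth):
    """predictions: list of (label, confidence) pairs; ground_truth: list of labels."""
    wrong = sum(1 for (label, _conf), truth in zip(predictions, ground_truth)
                if label != truth)
    return wrong / len(ground_truth)

# Hypothetical predictions for four images of women (all ground truth "female"):
preds = [("female", 0.92), ("male", 0.61), ("female", 0.88), ("male", 0.55)]
truth = ["female", "female", "female", "female"]
print(error_rate(preds, truth))  # 0.5 on this toy sample
```

Note that the confidence scores play no role in the error rate itself; a tool can be confidently wrong, which is part of why auditing these systems on diverse test sets matters.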




Copyright Naija News 2019. All rights reserved. You may only share Naija News content using our sharing buttons. Send all news and press releases to newsroom@naijanews.com.

Ogbolu George is a graduate of Mass Communication from the University of Benin. He loves politics, is a movie addict and a die-hard Arsenal fan. George is a Senior Content Creator at Naija News.
