Amazon's face-detection tech is biased

About: Amazon
Date: January 27, 2019


TL;DR:

  • A new study demonstrates that Amazon’s face-detection technology shows bias;
  • The technology had higher error rates for people with darker skin (error rates as high as 31%);
  • Deborah Raji and Joy Buolamwini conducted the research and will present it this Monday;
  • Handing technology with such high error rates over to governments could be a “grave mistake”;
  • Misinterpreting such data could also be a huge mistake;
  • Amazon is expected to address the issue soon; it has not confirmed anything yet;
  • Read Joy Buolamwini’s Medium response to the study here.


Did we miss anything important? Please let us know. This is a rapidly evolving product, and your feedback is crucial.

Quiet by 4CAST

Our no-noise newsletter. No fluff. Absolutely pure content.