A new study says Amazon’s facial-detection technology often misidentifies women as men, particularly when they have darker skin.
The researchers from MIT and the University of Toronto say they studied Amazon’s technology because it has marketed it to law enforcement. Privacy and civil rights advocates say Amazon should not do so because of worries about discrimination against minorities.
Amazon says the study tested “facial analysis,” not “facial recognition,” technology, and that the company has updated its technology since the study was done.
But MIT Media Lab researcher Joy Buolamwini says companies should check all of their systems that analyze human faces for bias. She adds that if a company sells one system shown to have bias, “it is doubtful your other face-based products are also completely bias free.”