MIT Study Says Amazon's Facial Analysis Software Is Prone to Misidentifying Darker-skinned Women
Amazon's Rekognition facial analysis software, which is being tested in Orlando as a surveillance system, misidentifies darker-skinned women as men roughly a third of the time, according to a new MIT Media Lab study.
Amazon's facial analysis software made no errors in classifying the gender of lighter-skinned men. But it misidentified the gender of lighter-skinned women about 7 percent of the time, and it mistook darker-skinned women for men about 31 percent of the time.
Rekognition's facial analysis tool, which detects facial attributes and expressions, is distinct from its facial recognition component, which identifies people by matching a photo against a predefined database of images.
The City of Orlando is currently testing Amazon's facial recognition software in a manner unprecedented among American law enforcement agencies. Once fully operational, the technology will plug into the city's public street camera network and scan everyone in view in an attempt to identify and track a person of interest in real time.
Matt Wood, a machine-learning engineer at Amazon, slammed the MIT study, saying it tested only facial analysis and ignored facial recognition. When Amazon ran its own version of the test with an up-to-date version of Rekognition and a similar data set of images, the company says it found "exactly zero false positive matches with the recommended 99 [percent] confidence threshold."
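The confidence threshold Wood cites works as a simple filter: the system reports a candidate match only if its similarity score clears the bar, so raising the threshold trades missed matches for fewer false positives. A minimal illustrative sketch of that filtering logic (not Amazon's actual implementation; the function name and data are hypothetical):

```python
def filter_matches(candidates, threshold=99.0):
    """Keep only candidate matches whose confidence meets the threshold.

    candidates: list of (person_id, confidence_percent) tuples, as a
    recognition system might return for one probe image.
    """
    return [(pid, conf) for pid, conf in candidates if conf >= threshold]


# At the recommended 99% threshold, a borderline 85% match is discarded,
# while it would have been reported at a looser 80% setting.
candidates = [("person_a", 99.4), ("person_b", 85.0), ("person_c", 99.9)]
print(filter_matches(candidates))              # only the two 99%+ matches
print(filter_matches(candidates, threshold=80.0))  # all three pass
```

This is why the choice of threshold matters in the dispute: a system can show "zero false positives" at a strict setting even if it performs worse when deployed with a looser one.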
But Joy Buolamwini, co-author of the MIT study, says Amazon takes a "denial, deflection and delay" approach to calls for government regulation.
In a Medium post, Buolamwini wrote, “We cannot rely on Amazon to police itself or provide unregulated and unproven technology to police or government agencies.”