Representative John Lewis of Georgia and Representative Bobby L. Rush of Illinois are both Democrats, members of the Congressional Black Caucus and civil rights leaders.
But facial recognition technology made by Amazon, which is being used by some police departments and other organizations, incorrectly matched the lawmakers with people who had been arrested for a crime, the American Civil Liberties Union reported on Thursday morning.
The errors emerged as part of a larger test in which the civil liberties group used Amazon’s facial software to compare the photos of all federal lawmakers against a database of 25,000 publicly available mug shots. In the test, the Amazon technology incorrectly matched 28 members of Congress with people who had been arrested, amounting to a 5 percent error rate among legislators. The test disproportionately misidentified African-American and Latino members of Congress as the people in mug shots.
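The 5 percent figure can be checked with a line of arithmetic, assuming the test covered all 535 voting members of Congress (435 in the House and 100 in the Senate):

```python
# Sanity check of the error rate the A.C.L.U. reported.
# Assumption: "all federal lawmakers" means the 535 voting members of Congress.
false_matches = 28
members_scanned = 535

error_rate = false_matches / members_scanned
print(f"{error_rate:.1%}")  # about 5 percent
```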
“This test confirms that facial recognition is flawed, biased and dangerous,” said Jacob Snow, a technology and civil liberties lawyer with the A.C.L.U. of Northern California.
Nina Lindsey, an Amazon Web Services spokeswoman, said in a statement that the company’s customers had used its facial recognition technology for various beneficial purposes, including preventing human trafficking and reuniting missing children with their families. She added that the A.C.L.U. had used the company’s face-matching technology, called Amazon Rekognition, differently during its test than the company recommended for law enforcement customers.
For one thing, she said, police departments do not typically use the software to make fully autonomous decisions about people’s identities. “It is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment,” Ms. Lindsey said in the statement.
She also noted that the A.C.L.U. had used the system’s default setting for matches, called a “confidence threshold,” of 80 percent. That means the group counted any face matches the system proposed that had a similarity score of 80 percent or more. Amazon itself uses the same percentage in one facial recognition example on its site describing successfully matching an employee’s face with their work I.D. badge. But Ms. Lindsey said that Amazon recommended that police departments use a much higher similarity score — 95 percent — to reduce the likelihood of erroneous matches.
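The effect of raising the threshold can be sketched with a toy filter. The candidate labels and similarity scores below are invented for illustration; a real face-matching service returns richer match records, but the cutoff logic is the same:

```python
def filter_matches(candidates, threshold):
    """Keep only face matches whose similarity score meets the threshold.

    candidates: list of (label, similarity_percent) pairs, a stand-in for
    the ranked match list a face-matching service might return.
    """
    return [(label, score) for label, score in candidates if score >= threshold]

# Hypothetical matches proposed for a single probe photo.
candidates = [
    ("mug shot #1041", 83.2),  # weak match -- clears the 80 percent default
    ("mug shot #2267", 96.5),  # strong match
]

print(filter_matches(candidates, 80))  # both candidates survive
print(filter_matches(candidates, 95))  # only the 96.5 percent match survives
```

At the 80 percent default, the weak match is counted; at the 95 percent setting Amazon recommends for law enforcement, it is discarded.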
Facial recognition — a technology that can be used to identify unknown people in photos or videos without their knowledge or permission — is fast becoming a top target for privacy experts.
Proponents see it as a useful tool that can help identify criminals. It was recently used to identify the man arrested for the deadly shooting at The Capital Gazette’s newsroom in Annapolis, Md.
But civil liberties groups view it as a surveillance system that can inhibit people’s ability to participate in political protests or go about their lives anonymously.
Over the last two months, Amazon has come under increasing pressure for selling its facial technology, called Rekognition, to law enforcement agencies. The company has sold the service as a way for police departments to easily identify suspects in photos or videos.
Amazon’s site describes how its system can perform “real-time face recognition across tens of millions of faces” and detect “up to 100 faces in challenging crowded photos.” (The New York Times recently used the Amazon technology to help identify guests at the royal wedding of Prince Harry and Meghan Markle.)
In May, two dozen civil liberties groups, led by the A.C.L.U., wrote a letter to the Amazon chief executive, Jeff Bezos, demanding that his company stop selling the facial technology to law enforcement. The groups warned that the software could be used to trail protesters, undocumented immigrants or other members of the public — not just criminal suspects.
Mr. Snow of the A.C.L.U. said that the incorrect matches of the lawmakers should push Congress to put a moratorium on law enforcement’s use of facial recognition technology.
But in a blog post last month, Matt Wood, general manager of artificial intelligence at Amazon Web Services, said that there had been no reports of law enforcement abuse of Amazon’s facial technology. He added that Amazon believed it was “the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future.”
In a letter to Amazon from the Congressional Black Caucus, the group’s members noted the potential for racial bias with the technology — an issue raised by a recent M.I.T. study that found some commercial facial recognition systems correctly identified a higher proportion of white men than darker-skinned women. In their letter, members of the Congressional Black Caucus urged Mr. Bezos to hire “more lawyers, engineers and data scientists of color to assist in properly calibrating this technology to account for racial bias that can lead to inaccuracies with potentially devastating outcomes.”
In the civil liberties group’s test, the Amazon software misidentified several members of the Congressional Black Caucus as other people who had been arrested, including Mr. Lewis and Mr. Rush.
“We think these test results really raise the concern that facial recognition has a race problem,” said Mr. Snow, the A.C.L.U. lawyer.