SAN FRANCISCO — Amazon's controversial facial recognition program, Rekognition, falsely identified 28 members of Congress during a test of the program by the American Civil Liberties Union, the civil rights group said Thursday.
In its test, the ACLU scanned photos of all members of Congress and had the system compare them with a public database of 25,000 mugshots.
The group used the default "confidence threshold" setting of 80 percent for Rekognition, meaning the test counted a face match at 80 percent certainty or more.
At that setting, the system misidentified 28 members of Congress, a disproportionate number of them people of color, matching them instead with entirely different people who had been arrested for a crime.
The members of Congress used in the test included Republicans and Democrats, men and women, and legislators of all ages.
Amazon responded that when using facial recognition for law enforcement activities, it recommends setting the confidence threshold at 95 percent or higher.
A spokesperson for Amazon Web Services said in a statement that the test results could have been improved by raising the confidence threshold: while 80 percent is an acceptable threshold for photos of everyday items and objects, it is not appropriate for identifying individuals with a "reasonable level of certainty."
In its report on its findings, the ACLU said that the default setting for the program was 80 percent and that Amazon recommends that level for face-based user verification.
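The effect of that threshold dispute can be illustrated with a minimal sketch. This is not Amazon's or the ACLU's code, and the candidate names and similarity scores below are invented for illustration; it only shows how the same list of candidate matches yields different results at the 80 percent default versus the 95 percent Amazon recommends for law enforcement.

```python
def matches_above(candidates, threshold):
    """Return the candidates whose similarity score meets the confidence threshold."""
    return [name for name, similarity in candidates if similarity >= threshold]

# Hypothetical similarity scores from a face search (not real data)
candidates = [
    ("mugshot_1041", 96.2),
    ("mugshot_2210", 88.5),
    ("mugshot_0067", 81.3),
]

print(matches_above(candidates, 80))  # at the default, all three count as "matches"
print(matches_above(candidates, 95))  # at Amazon's recommended setting, only one does
```

Under this toy scoring, the two weaker candidates that count as matches at 80 percent disappear at 95 percent, which is the crux of the disagreement between Amazon and the ACLU over how the test was configured.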
The tool is used for facial recognition in arenas outside of law enforcement. For example, during the royal wedding of Prince Harry and Meghan Markle in May, British broadcaster Sky News used Rekognition to help it identify celebrities as they entered Windsor Castle.
The software has also been used by Pinterest to match images, by stores to track people, to identify potentially unsafe or inappropriate content online and to find text in images.
Privacy and policing
Amazon has come under fire recently for selling the facial recognition service to law enforcement agencies because of concerns that it might be used to track people going about their daily lives, or at political protests or in other situations where most people now presume they are anonymous.
Because of these concerns, civil rights groups, privacy advocates and even some Amazon employees and shareholders have asked CEO Jeff Bezos to stop allowing police and federal agencies to use the facial recognition technology.
The results of the ACLU's test "demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance," wrote Jacob Snow, a technology and civil liberties attorney for the ACLU of Northern California.
Amazon built the facial and product recognition tool two years ago as a way for customers to quickly search a database of images for matches. Rekognition requires the user to have two sets of images. The first is generally a large database of known individuals. The user then submits an image of an individual, which the software compares with those in the database to find what it believes are matches.
Snow said the product has been "aggressively" marketed to police. At least two agencies, one in Orlando, Florida, and one in Washington County, Oregon, are currently testing Rekognition.
The analyst in charge of Washington County's program says the department would never rely on facial recognition software alone to so much as approach a potential suspect, much less arrest one.
The department doesn’t set a confidence threshold at all because all the decisions are made by humans, said Chris Adzima, the senior information systems analyst with the Washington County Sheriff’s Office in Hillsboro, Oregon.
“When we have an image from an active investigation, the investigator will put it into our system and the system will spit out the top five likely results. Then the investigator will look through those five to determine if any of those are possible leads,” he said.
Even then, the investigator has to do proper due diligence, running the name to see if the person has a record or known contact with potential victims.
While Adzima said Washington County has good success in using facial recognition to help identify people who were eventually tied to crimes, “almost none of them were under a 95 percent confidence threshold,” he said.
Facial recognition technology was successfully used to identify the man arrested for the shooting at the Capital Gazette newsroom in Annapolis, Maryland.
But the ACLU and other privacy advocates say the technology is an invasion of privacy. And they say it could be used to target and track immigrants or protesters.
In May, 34 civil rights groups sent a letter to Amazon CEO Jeff Bezos, saying people should be "free to walk down the street without being watched by the government."
That same month, members of the Congressional Black Caucus also sent a letter to Bezos, writing that they were troubled by the "profound negative unintended consequences" this technology could have for African Americans, undocumented immigrants and protesters.
"The race based 'blind spots' in artificial intelligence, especially those that are manifested in facial recognition technology, have been well documented," the letter said.
These "blind spots" in facial recognition AI include a 2015 incident in which a Google photo application labeled pictures of African American users as "gorillas." They also include a study released earlier this year by the Massachusetts Institute of Technology, which tested three types of commercial facial recognition software and found that programs used to identify a person's gender had an error rate of 0.8 percent for light-skinned men and 34.7 percent for dark-skinned women.
Six members of the Congressional Black Caucus were misidentified in the ACLU's test of Rekognition.
Reps. Jimmy Gomez (D-Calif.) and John Lewis (D-Ga.), who were both falsely identified during the test, sent a letter to Bezos on Thursday asking to meet immediately to address the "defects" of the technology "in order to prevent inaccurate outcomes."
In a statement, Lewis said he found the results of the software "deeply troubling."
"As a society, we need technology to help resolve human problems, not to add to the mountain of injustices presently facing people of color in this country," Lewis said. "Black and brown people are already unjustly targeted through a discriminatory sentencing system that has led to mass incarceration and devastated millions of families. The poor are already ensnared by the complications of a judiciary that leads the innocent to plead guilty because they can find no other way out."
Lewis said he has been a victim of misidentification and erroneous targeting before. "What would happen, under these already threatening conditions, if people from these same communities were misidentified by facial recognition software? How would they prove to police that a computerized result is false?" he asked.
"It is not enough for Amazon to advise users of a 20 percent failure rate in their software," he said. "Law enforcement should not use this technology until the onerous civil rights and civil liberties issues are confronted and accuracy is guaranteed. If industry wants to engage in the public sphere, it needs to make the public good, not profit, a top priority. American families should not be collateral damage on the road to technological innovation."