Amazon Buckles to Pressure Over Police Use of Facial Recognition

Civil rights groups have long called for a moratorium on Rekognition



As a national spotlight remains focused on police brutality, Amazon is implementing a one-year moratorium on the use of its facial recognition technology by police departments.

This marks a significant reversal for Amazon, which, until now, had not been swayed by calls from dozens of organizations, including the American Civil Liberties Union, to stop selling the technology to law enforcement. These groups argue that because people of color are disproportionately harmed by police practices, Amazon’s facial recognition technology, known as Rekognition, could exacerbate the problem.

‘The most sophisticated, modern technology that exists’

A little over a year ago, for example, Amazon shareholders voted down a proposal that would have restricted the sale of Rekognition to government agencies. (Those shareholders also rejected a proposal to study the extent to which Rekognition may violate civil rights.)

Adweek reached out to Amazon about Rekognition following news from IBM earlier this week in which CEO Arvind Krishna told members of Congress that IBM no longer offers its general purpose facial recognition software because the company opposes its use for surveillance, racial profiling and violations of human rights. Amazon did not respond.

Andrew Jassy, CEO of Amazon Web Services, which oversees Rekognition, has, however, spoken to the PBS documentary series Frontline. In an interview that aired in February, he said Amazon believes police departments should be allowed to experiment with Rekognition because law enforcement should have access to “the most sophisticated, modern technology that exists.”

He noted that Amazon has never received a report of misuse by law enforcement, and he believed any abuse would be made public.

“We see almost everything in the media today, and I think you can’t go a month without seeing some kind of issue that somebody feels like they’ve been unfairly accused of something of some sort, so I have a feeling that if you see police departments abusing facial recognition technology, that will come out … It’s not exactly kept in the dark when people feel like they’ve been accused wrongly,” Jassy told Frontline.

What a difference a few months makes.

As with any burgeoning technology, where, when and how to use facial recognition is a complicated question.

Facial recognition software has biases

First and foremost, multiple studies have demonstrated the technology is less capable of accurately identifying women and people of color.

In July 2018, for instance, the ACLU released a study in which Rekognition incorrectly matched 28 members of Congress with people in mugshots—and the false matches were disproportionately of people of color. (But it’s hardly the only study.)

At the time, an Amazon spokesperson pointed to uses that benefit society, such as preventing human trafficking and finding missing children, and said the ACLU test could have been improved by using a higher confidence threshold for matches, which is what it recommends for law enforcement. (Jassy repeated these talking points in his Frontline interview.)

The Amazon rep also noted Rekognition is “almost exclusively” used to narrow the field of possible suspects. But that’s not always how facial recognition is used. Look no further than the January investigation by The New York Times, which found that officers in Pinellas County, Fla. (who, to be clear, were using their own in-house database) sometimes made arrests based on facial recognition matches when they had no other evidence.

Last February, Amazon said it was planning to work with the National Institute of Standards and Technology (NIST), the U.S. government lab whose tests serve as an industry benchmark for facial recognition, to develop standardized tests to remove bias and improve accuracy. Until that point, Amazon had not submitted Rekognition for testing alongside 76 other developers because, it said, its technology was too “sophisticated.”

Sixteen months later, however, a representative for NIST said Amazon still has not submitted an algorithm.

The public may not know police are watching

Another part of the problem is the public doesn’t always know when facial recognition is in use.

The city of Orlando, Fla. (which later dropped the technology) and the Washington County, Ore. Sheriff’s Office are among the few government agencies publicly identified as Amazon customers, but they certainly aren’t alone in using facial-recognition technology from Amazon or elsewhere.

"We’re glad the company is finally recognizing the dangers face recognition poses to Black and Brown communities."
Nicole Ozer, ACLU


@lisalacy lisa.lacy@adweek.com Lisa Lacy is a senior writer at Adweek, where she focuses on retail and the growing reach of Amazon.