Facial Recognition Will Be Watching and Storing Your Emotions and Data

The tech can identify fear and other feelings you might otherwise want to hide


Editor’s note: Industry consultant Shelly Palmer is taking his popular newsletter and turning it into an Adweek article once per week in an ongoing column titled “Think About This.”

Amazon says its Rekognition facial recognition software can now identify fear along with seven other emotions: happy, sad, angry, surprised, disgusted, calm and confused. What Amazon is not telling you, though, is that facial recognition, when combined with other data, will be able to take a pretty good guess at lying, cheating, jealousy and other feelings you do your best to hide with your “poker face.” Lie detectors are so last century.

“Do you think they liked the movie?” is no longer a valid question. “What part of the movie bored them?” is a valid question, answerable with the facial recognition data set on a frame-by-frame basis.

Amazon doesn’t need facial recognition to recognize how scary this is. The rest of us, however, should all be scared out of our wits.

Authentication versus identification

Two popular use cases for facial recognition are authentication (using facial recognition to unlock your phone, computer or house) and identification (hunting for someone in a crowd of people).

Using facial recognition for authentication is relatively safe because, generally, the data is stored locally on your device and not shared with anyone. Identification, on the other hand, compares your face with a shared database, and in the process, your image is generally added to a shared database.

For example, as of early 2018, Chinese police had installed more than 170 million cameras across the country, a network so efficient and effective that it was used to identify and apprehend a BBC reporter within seven minutes of his headshot being added to the facial recognition database.

Fight for the Future created a map that highlights how local and state police use the technology in airports and other public spaces. While a handful of cities have cracked down on facial recognition, the number of cities that have embraced it far outnumbers them. As more states (including Texas, Florida and Illinois) allow the FBI to use facial recognition software to scan through their DMV databases, the battle for privacy will continue to rage as “state departments of motor vehicles databases [turn] into the bedrock of an unprecedented surveillance infrastructure.”

U.S. Customs and Border Protection (CBP) plans to have face scanners installed at every U.S. airport by 2022. The Transportation Security Administration (TSA) is testing similar devices at security check-in lines. More than 117 million Americans (roughly half of all U.S. adults) have their photo in a law enforcement facial recognition database.

So easy anyone can do it

If you have some engineering skills, you can easily use Amazon Rekognition to detect sentiment as well as unsafe content in your images or videos. It can also recognize thousands of celebrities across a number of categories, such as politics, sports, business, entertainment and media, all at very low cost.
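As a rough illustration (not drawn from the article or from Amazon’s documentation), here is a minimal sketch of what those calls can look like using boto3, the AWS SDK for Python. The configured credentials and the file name photo.jpg are assumptions for the example.

```python
# Minimal sketch, assuming AWS credentials are already configured and a local
# file named "photo.jpg" exists (both are assumptions for illustration).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Flag potentially unsafe content in the image.
moderation = rekognition.detect_moderation_labels(Image={"Bytes": image_bytes})
for label in moderation["ModerationLabels"]:
    print(f"Unsafe content: {label['Name']} ({label['Confidence']:.1f}%)")

# Match faces in the image against Rekognition's celebrity index.
celebrities = rekognition.recognize_celebrities(Image={"Bytes": image_bytes})
for celeb in celebrities["CelebrityFaces"]:
    print(f"Celebrity: {celeb['Name']} ({celeb['MatchConfidence']:.1f}%)")
```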

Amazon’s new use case takes Rekognition to a new level. Now you can use the software to take a pretty good guess at what a person in an image or a video is feeling. What could possibly go wrong?
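The emotion call itself is just as simple. The sketch below, under the same assumptions as above, asks Rekognition for the full face analysis and prints the confidence score it assigns to each emotion label, fear included.

```python
# Minimal sketch of Rekognition's emotion scoring, under the same assumptions
# as above (configured AWS credentials, a local image file named "face.jpg").
import boto3

rekognition = boto3.client("rekognition")

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests the full face analysis, which includes Emotions.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    for emotion in face["Emotions"]:
        # Labels include HAPPY, SAD, ANGRY, SURPRISED, DISGUSTED, CALM,
        # CONFUSED and the newly added FEAR, each with a confidence score.
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")
```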

To say that algorithms could understand what we are feeling before we are conscious of having the feeling is to understate where this is going. The scary part is that this technology will learn what we feel and will be able to predict it with uncanny accuracy.

The potential ramifications of this simple association of action and reaction are extraordinary. If I know when you’re feeling a certain way, I truly have the power to “play” you. The information asymmetry we enjoy by “staying quiet” disappears, and each of us will become the proverbial “open book.”

Amazon and others

I don’t want to single out Amazon. Google, Facebook and many other big tech companies, along with hundreds of startups, are focused on facial recognition and its benefits. But do the benefits outweigh the risks?

In May 2018, the ACLU cited concerns about a lack of oversight around Amazon’s Rekognition. The civil liberties group issued a letter “calling out the potential for abuse of the system among law enforcement” and asked Amazon to stop selling it to government agencies.

In the year since that letter was issued, Amazon has doubled down. The company has reportedly partnered with 200 law enforcement agencies, and it has “filed for facial recognition technology patents that could identify ‘suspicious’ people.”

How do you feel about facial recognition software being combined with the enriched data profiles owned by a small number of data-rich companies (or local, state and federal agencies)? If the Rekognition algorithm were looking at me right now, it wouldn’t see fear; it would see abject terror.