How Brands Can Leverage Image Recognition Technology to Learn About Consumers

Some brands are already doing so

Ever wonder how your iPhone recognizes your face and even attempts to recognize people in your photos?

Image recognition, a subset of computer vision, trains a computer to process images much as a human brain does. The field was born out of a summer research project in 1966, when the original idea was to attach a camera to a computer and simply have it say what it saw. More than 50 years later, image recognition systems still need to be trained by humans.

Fast forward to today, and image recognition is part of our everyday life and is being used by brands and retailers to sell items. Tech giants such as Apple, eBay, Facebook, IBM and even Kim Kardashian continue to push the envelope in image recognition. Google’s advanced image search continues to learn and evolve, and both Apple and Facebook can identify a person from an uploaded image.

Facebook researchers and engineers trained image recognition networks with up to 3.5 billion Instagram images labeled with as many as 17,000 hashtags. A recent New York Times article revealed that workers at AI click farms spend their days labeling images.

But how are brands and marketers using this nascent technology?

Brands using image recognition

EBay recently launched Image Search, which allows users to take a picture or select a photo from their device and search eBay for items based on the image. This new feature leverages machine learning to let shoppers use images when searching for matching items. Anything that makes shopping easier for the consumer is a good thing.

Screenshop by Craze is Kim Kardashian’s image search app. All users need to do is upload a screenshot of a look they like, and it becomes immediately shoppable on Screenshop. Not only does the app show the item they are looking for, but it also upsells by showing similar clothes and accessories in addition to more affordable options. The app becomes the user’s own personal shopper: accessible at any time and open to browsing at their own convenience.

Both companies are using contextual, image-based search to personalize the shopping experience.

Marketers using image recognition

What may seem like a picture of your friends at a restaurant is much more than that. Basic elements such as location, tagged friends and hashtags give some identifying qualities. Image recognition, however, analyzes everything within the image. It can tell whether you are drinking beer or vodka by processing the color of the liquid in a glass, and it can even pick up the name of the restaurant from the menu on the table. The brand and fabric of clothing can also be identified.
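To make the beer-versus-vodka example concrete, here is a deliberately simplified sketch. Real systems use trained neural networks on full images; this toy version just averages pixel colors over a hypothetical "glass" region of a photo, and the thresholds and function names are invented for illustration.

```python
# Toy illustration of color-based drink classification, as described above.
# Assumption: we already have the pixels of the glass region as (R, G, B) tuples.

def average_color(pixels):
    """Average the (R, G, B) channels over a list of pixel tuples."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def classify_drink(pixels):
    """Guess the drink from the dominant color of the liquid.

    Near-colorless (all channels bright and similar) suggests vodka;
    amber (strong red, weak blue) suggests beer.
    """
    r, g, b = average_color(pixels)
    if min(r, g, b) > 150 and max(r, g, b) - min(r, g, b) < 30:
        return "vodka"
    if r > 120 and b < 100:
        return "beer"
    return "unknown"

# A patch of amber pixels vs. a patch of clear, bright pixels.
amber_patch = [(193, 140, 50)] * 100
clear_patch = [(230, 228, 235)] * 100
print(classify_drink(amber_patch))  # beer
print(classify_drink(clear_patch))  # vodka
```

A production classifier would learn these distinctions from labeled photos rather than hand-tuned thresholds, but the underlying signal, color in a region of interest, is the same.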

This kind of information provides many data points about who we are as consumers, and of course, it is very valuable to marketers. If you are an influencer who only wears designer clothes and drinks champagne, advertisers can use that information to retarget you with relevant items. Marketers and tech companies already use everyday search data to serve contextual ads. After all, we have all experienced looking at an item online and then having it follow us around the web through retargeting.

The future

Sentiment analysis is a newer entrant into the field of image recognition.

Facebook has several patents in this area, one of which is described as “techniques for emotion detection and content delivery.” The technique uses a camera to track a user’s emotional state while they view different things, such as puppy videos, and then serves up future content simply by reading that emotional state. This type of image recognition adds an entirely new layer of personalization.
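The detect-then-serve loop the patent language describes can be sketched in a few lines. Everything here is hypothetical: the emotion detector is stubbed out, and the content mappings are invented purely to show the flow.

```python
# Hypothetical sketch of emotion-driven content delivery: detect an emotional
# state from the camera (stubbed out here), then choose content to serve next.

CONTENT_BY_EMOTION = {
    "happy": "more puppy videos",
    "sad": "uplifting stories",
    "neutral": "trending news",
}

def serve_content(detected_emotion):
    """Return content matched to the viewer's detected emotional state,
    falling back to a default feed when the emotion is unrecognized."""
    return CONTENT_BY_EMOTION.get(detected_emotion, "trending news")

print(serve_content("happy"))  # more puppy videos
print(serve_content("confused"))  # trending news
```

The hard part in practice is the detection step itself, which is why the accuracy problems described below matter so much.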

However, it’s not perfect by any means, and there are many instances of image recognition falling short. For example, if the same person gets a different haircut or hair color, the computer will sometimes identify them as two separate people. In reality, image recognition can be fooled into thinking a person is someone else by slight changes or small nuances the computer has not yet been trained to recognize.

Over the past 50 years, image recognition has evolved tremendously. Facial recognition companies such as RetailDeep use a smartphone camera to identify a shopper the minute they walk into a store, giving the sales staff access to their entire order history and bridging the online world with the offline. In the future, devices such as Echo Play and Google Hub may even use the front-facing camera to understand facial emotions during two-way conversations.