Imagine walking into a grocery store, toward the produce aisle, but then getting a ping on your phone that the cookies you bought last week are on sale. You add them to your cart, put your phone away and keep shopping.
This is one scenario that could very well become a reality with a new retail innovation project from Adobe Labs, revealed exclusively to Adweek. The technology can track live foot traffic in a store and, through Adobe Analytics, break shoppers down into a variety of data segments, such as people who spend a lot of money, the types of products they buy (organic, gourmet, sweet tooth) and more.
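Adobe has not published how its segmentation works, but the idea of bucketing shoppers by spend and product preference can be illustrated with a minimal sketch. All field names, thresholds and labels below are invented for illustration, not Adobe's:

```python
# Hypothetical shopper segmentation, loosely inspired by the segments
# described above. Thresholds and category labels are invented.

def segment_shopper(profile):
    """Assign simple segment labels based on a shopper's purchase history."""
    segments = []
    # Flag high spenders (cutoff chosen arbitrarily for the example).
    if profile["avg_basket_spend"] > 100:
        segments.append("big spender")
    # Tag shoppers by the product category they buy most often.
    counts = profile["category_purchases"]  # e.g. {"organic": 12, "sweets": 3}
    if counts:
        segments.append(max(counts, key=counts.get))
    return segments

shopper = {
    "avg_basket_spend": 120.50,
    "category_purchases": {"organic": 12, "gourmet": 4, "sweets": 3},
}
print(segment_shopper(shopper))  # ['big spender', 'organic']
```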
While Adobe Labs is currently demoing the project with grocery stores in mind, it could be applied to clothing and home improvement stores.
The project is built on the Adobe Cloud Platform and pulls in data from Internet of Things sensors, beacons and, if the brand has one, an app. Depending on the retailer, it can also draw on point-of-sale systems and online data.
“Previously ecommerce has been the primary real time testing ground for delivering a curated, relevant experience to learn about your customer and can be used to inform other marketing channels,” said Jonathan Fanucci, vp of performance media at 360i. “This technology puts physical locations on the same playing field, and if not at least gives an advantage in acquiring a customer and building brand equity.”
Retailers can view a live foot-traffic map and segment their shoppers, as well as see how many people are currently waiting in a checkout line and any other issues in the store, such as a needed cleanup or inventory running low.
If a store manager wants better insight into a customer, they can click on one of the moving dots to see that person’s visitor profile. There, they can find demographic information such as whether the customer is married, where they live, what kind of device they’re using and how much they usually spend. The idea behind this profile is to understand the customer’s wants and needs and then target certain offers to them.
The offers portion, powered by Adobe Sensei, the company’s artificial intelligence and machine learning framework, can be pushed to a customer via an app notification or on an in-store screen.
“Having the ability to activate against a customer online and in-store indicates the allowance to measure essentially in a multi-touch attribution fashion to effectively evaluate the success of various campaigns or marketing channels,” Fanucci said.
Adobe Labs, which is unveiling the project at the National Retail Federation conference in New York from January 14 to 16, is currently focusing on grocery stores to help them reduce food waste.
“If you have day-old bread, a lot of that gets thrown away,” said Kevin Fu, an Adobe spokesperson. “But if you’re able to line up day-old bread with people who are more likely to accept that offer and over time know who’s more responsive, then you’re able to throw away less food and manage your inventory better.”
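Fu’s example amounts to ranking shoppers by how often they have accepted similar markdowns before. A minimal sketch of that idea follows; the acceptance histories, shopper IDs and the cutoff are invented for illustration and are not Adobe’s:

```python
# Hypothetical version of the day-old-bread idea: send the markdown to
# shoppers whose past behavior suggests they will accept it. Histories
# and the cutoff below are invented, not Adobe's.

def acceptance_rate(history):
    """Fraction of past markdown offers this shopper accepted."""
    if not history:
        return 0.0
    return sum(history) / len(history)

def target_offer(shoppers, cutoff=0.3):
    """Return the IDs of shoppers whose acceptance rate clears the cutoff."""
    return [sid for sid, history in shoppers.items()
            if acceptance_rate(history) >= cutoff]

# 1 = accepted a past markdown offer, 0 = ignored it.
shoppers = {
    "a41": [1, 0, 1, 1],   # 75% acceptance -> gets the offer
    "b17": [0, 0, 1, 0],   # 25% acceptance -> skipped
    "c93": [],             # no history     -> skipped
}
print(target_offer(shoppers))  # ['a41']
```

Over time, each accepted or ignored offer updates the history, so the targeting sharpens as the retailer learns who responds.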
Adobe imagines this data being gathered in various ways: a customer could download an app and opt in to being tracked; a smart shopping cart (Walmart has a patent for one) could record how someone moves through the store; or a grocery store could push certain offers to in-store screens when someone is standing in front of them, based on whatever other data it has collected.
With this technology, Adobe is trying to bridge the offline and online sides of retail.
“We also recognize that while there’s a double-digit increase in the ecommerce space, the physical is still important,” said Michael Klein, director of industry strategy, retail, travel and hospitality at Adobe.