Adobe Is Launching AI-Powered Voice Analytics

It's using Sensei to understand big data trends

Adobe's launching an analytics tool to help marketers understand how people use voice assistants. Photo Illustration: Yuliya Kim; Sources: Getty Images

To better understand how people consume media and marketing via voice-enabled devices, Adobe is adding voice analytics to the Adobe Analytics Cloud.

Today, the company is launching a service that will track how users interact with the tools that media companies and brands build for Amazon’s Alexa, Google’s Assistant and Apple’s Siri. The voice analytics offering will also integrate Sensei, Adobe’s artificial intelligence and machine learning service, which will be able to make sense of trends within massive data sets. For example, with the software, companies developing skills (essentially the voice equivalent of apps) will be able to track performance across devices and platforms and see how it all fits into the overall customer journey.

The market for voice-enabled devices and voice assistants continues to grow rapidly, thanks in large part to the popularity of Amazon Echo and Google Home. (Apple recently announced its own Siri-enabled speaker, HomePod, which will be released later this year.) Sales of voice-enabled devices grew by 39 percent year over year, according to Adobe’s analytics data, and a recent report from analyst Mary Meeker found that 20 percent of mobile search queries in 2016 were made by voice.

According to Colin Morris, director of product management for Adobe Analytics Cloud, customer analytics now play more of a front-seat role than in the past, building customer profiles that go beyond digital behavior. The move is the next step in the evolution of Adobe Analytics, which began in 2009 when Adobe bought Omniture for web analytics and later expanded into digital and marketing analytics.

“The fact that we can return errors or null statements to understand what the voice assistant got and what it didn’t, I think, is important for developers and product managers to create a better experience,” Morris said.
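The article doesn’t describe Adobe’s implementation, but the idea Morris raises can be sketched in a few lines: tally the requests that came back as errors or null responses, so developers can see which utterances the assistant most often fails to understand. The field names and sample data below are hypothetical.

```python
# Hypothetical sketch (not Adobe's actual tooling): count voice requests
# whose resolved intent was an error or null, grouped by utterance, so
# developers can spot which phrases the assistant fails to understand.

from collections import Counter

def unresolved_utterances(responses):
    """Count utterances that resolved to no intent or to an error."""
    misses = Counter()
    for r in responses:
        if r.get("intent") is None or r.get("error"):
            misses[r["utterance"]] += 1
    return misses

# Hypothetical sample of logged assistant responses.
responses = [
    {"utterance": "play my jams", "intent": None},
    {"utterance": "order a coffee", "intent": "OrderItem"},
    {"utterance": "play my jams", "intent": None},
]

print(unresolved_utterances(responses).most_common(1))
```

Surfacing the most frequent misses is what lets a product manager decide whether to add a new intent or retrain an existing one.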

According to a report from Gartner, half of all large organizations will use data from connected devices to drive advanced analytics and algorithms. However, Morris said one of the tricky things is that this data is collected in silos, such as voice, Internet of Things devices, mobile web activity and app usage, which makes it difficult to apply to understanding customers.

With Adobe’s analytics, one way brands can track voice usage is based on intent (what users are trying to do, such as playing music or ordering something) along with parameters (such as playing a specific song or ordering a specific item). And over time, analyzing that data can help brands understand how much a given consumer is worth based on their behavior.
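The intent-plus-parameters model described above can be sketched as a simple event record, with per-user value summed over time. This is a minimal illustration, not Adobe’s actual API; the event fields, value figures and aggregation are all assumptions.

```python
# Hypothetical sketch (not Adobe's API): model each voice interaction as
# an event with an intent and parameters, then sum estimated value per
# user to approximate how much a consumer is worth over time.

from collections import defaultdict

def make_event(user_id, intent, parameters, value=0.0):
    """Build a minimal voice-analytics event record."""
    return {
        "user_id": user_id,
        "intent": intent,          # what the user is trying to do
        "parameters": parameters,  # specifics, e.g. the song or item
        "value": value,            # assumed value of the action
    }

def value_by_user(events):
    """Sum event value per user across all recorded interactions."""
    totals = defaultdict(float)
    for e in events:
        totals[e["user_id"]] += e["value"]
    return dict(totals)

# Hypothetical events: two users, three interactions.
events = [
    make_event("u1", "PlayMusic", {"song": "Example Song"}, value=0.1),
    make_event("u1", "OrderItem", {"item": "coffee"}, value=2.5),
    make_event("u2", "PlayMusic", {"song": "Another Song"}, value=0.1),
]

print(value_by_user(events))
```

Separating the intent from its parameters is what lets the same event stream answer both aggregate questions (how often is music played?) and per-user ones (what is this listener worth?).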

If a user is interacting with a brand for the first time, the brand can measure top-of-the-funnel metrics and then track what happens over time for that user. At the aggregate level, brands can measure trends and patterns at scale: they can look at how requests are converting, and media companies can measure whether a person tunes in and for how long.

“You can start to really understand an interesting segmentation of people based on the behavior that they’re doing,” Morris said. “Which is at the end of the day the name of the game, right? It’s kind of like, what is the most valuable action that can happen in this experience? Who’s doing it? What’s different about them than everyone else? And either you have a marketing problem and have to go buy lookalike users, or you’ve got a product problem and need to graduate your users into a different lifetime value bucket.”

@martyswant Marty Swant is a former technology staff writer for Adweek.