How Watson Is Digging Through Decades of Video to Help Find New Sources of Revenue

IBM's supercomputer is mining for metadata, with promising results

If Food Network execs want to know which ingredients are most popular with viewers, they can just ask Watson. If TED members want to know the secret to happiness, they can just ask Watson.

The IBM supercomputer, which burst onto the scene in 2011 when it won Jeopardy, is now being tasked with helping media companies bring new life—and new revenue streams—to millions of hours of video.

With a full 80 percent of online traffic coming from video content, there’s a lot of data Watson can crunch, past, present and future.

“There’s massive amounts of content coming onto the cloud,” said Dave Mowrey, head of product for Watson Media & Entertainment. “Being able to run that huge amount of content through Watson, we can extract the value out of those videos by pulling metadata out.”

And those data points are giving content owners, brands and advertisers previously untapped opportunities.

Mowrey says Watson has the ability to understand the “sentiment, tone, natural language and visual analysis” of a video, whether it be a TED Talk, the Masters golf tournament or a cooking competition show on Food Network. That helps content owners increase engagement through recommendations and gives advertisers and content producers access to new audiences. Mowrey says the data will also “feed back into the content creation process.”
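To picture what that kind of metadata extraction might look like in practice, here is a minimal, hypothetical sketch: it assumes a transcript has already been pulled from a video segment and posts it to a stand-in analysis service for sentiment and keyword tagging. The endpoint URL, payload fields and response shape are illustrative assumptions, not IBM's actual Watson API.

```python
import requests

# Hypothetical endpoint standing in for a video-analysis service like the one
# described above; the URL and payload fields are assumptions, not IBM's real API.
ANALYZE_URL = "https://example.com/v1/analyze"


def extract_metadata(transcript: str, api_key: str) -> dict:
    """Send a video segment's transcript for sentiment/keyword analysis."""
    response = requests.post(
        ANALYZE_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "text": transcript,
            # Assumed feature names, mirroring the categories quoted above.
            "features": ["sentiment", "tone", "keywords"],
        },
        timeout=30,
    )
    response.raise_for_status()
    # e.g. {"sentiment": 0.8, "keywords": ["butter", "garlic"]}
    return response.json()


if __name__ == "__main__":
    segment = "Today we're making a garlic butter sauce that viewers love."
    print(extract_metadata(segment, api_key="YOUR_KEY"))
```

In a pipeline like the one Mowrey describes, tags of this sort would be attached to individual video segments and then used downstream for recommendations and ad targeting.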

IBM is showing off Watson and other cloud-based products at this week’s NAB Show in Las Vegas. Mowrey told Adweek that clients who’ve stopped by have been impressed.

“It’s incredible to see the excitement and the brainstorming they have around use cases I haven’t even thought of, from a compliance perspective, from a brand/advertising perspective, for both live and video-on-demand content. Everybody’s got different use cases,” he said. “It’s been amazing to see the reaction.”