Watson won’t be wearing anything fancy to the Grammys this weekend, but that’s not going to keep it from judging everyone else’s outfits. For the music awards’ 60th edition, the Recording Academy is partnering with IBM to bring its artificial intelligence to the red carpet.
On Sunday night, IBM will deploy its AI platform to analyze videos and photos of nominees and attendees as they arrive at the ceremony in New York. In addition to identifying each person, Watson will be able to understand styles, learn about this year’s fashion trends and compare them to those of previous years. People will then curate those findings, along with a selection of the many photos and videos taken by photographers, and upload them to the Grammys website, where fans can learn more about their favorite musicians as well as those honored decades ago.
Watson’s ability to rapidly ingest and analyze information is one of the many capabilities that have made the AI platform famous since it won Jeopardy! in 2011. During the Grammys, IBM officials estimate it will analyze at least 100,000 photos and hours of video, which humans will then curate and publish online during and after the event.
Beyond fashion trends, Watson will also analyze the lyrics of every song from this year’s nominees to identify trends in themes and emotions and compare them with the last 60 years of lyrics.
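IBM hasn’t published how Watson identifies lyric themes, but the basic idea of tallying theme-related vocabulary can be sketched in a few lines of stdlib Python. Everything here is hypothetical: the theme lexicons, the sample lyrics and the `theme_counts` helper are illustrative stand-ins, not Watson’s actual method, which would use trained language models rather than keyword lists.

```python
from collections import Counter

# Hypothetical theme lexicons; a real system would use trained models,
# not hand-picked keyword lists.
THEMES = {
    "love": {"love", "heart", "kiss", "baby"},
    "loss": {"gone", "goodbye", "tears", "alone"},
}

def theme_counts(lyrics: str) -> Counter:
    """Count how often words from each theme lexicon appear in the lyrics."""
    words = lyrics.lower().split()
    return Counter(
        {theme: sum(1 for w in words if w in vocab)
         for theme, vocab in THEMES.items()}
    )

# Comparing a made-up modern lyric against a made-up older one:
modern = theme_counts("baby i love your heart")
classic = theme_counts("tears fall and you are gone, goodbye my love")
```

Running the same tally over six decades of lyrics, grouped by year, would surface the kind of theme-over-time trends the article describes.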
“We want to give fans a closer connection with the music and the artists and the stories that they love, and we want to do it in a way that’s more seamless, more real time than ever before,” said Recording Academy CMO Evan Greene.
The tools Watson will use during the Grammys red carpet are part of a new platform called Watson Media, which IBM debuted in September during the U.S. Open, where IBM and the U.S. Tennis Association taught Watson the game of tennis so it could analyze the biggest moments of each match before publishing content on an app and other platforms.
According to John Kent, a program manager at IBM, Watson will be able to tell whether a photo is blurry or an attendee’s eyes are closed. It will also recognize colors and shapes so that this year’s palette can be compared with color dominance in years past. On the video side, Watson will mark when a musician first appears on the red carpet and when they leave it.
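IBM hasn’t detailed how Watson measures color dominance, but one common approach is to quantize each pixel into coarse color buckets and count which bucket wins. The sketch below, in stdlib Python, is purely illustrative: the `dominant_color` function, the bucket size and the toy pixel data are all assumptions, not IBM’s pipeline.

```python
from collections import Counter

def dominant_color(pixels, bucket=64):
    """Return the most common coarse color among a list of (R, G, B) tuples.

    Each channel is quantized to multiples of `bucket` (64 gives four
    levels per channel), so near-identical shades count as one color.
    """
    quantized = [tuple(c // bucket * bucket for c in p) for p in pixels]
    return Counter(quantized).most_common(1)[0][0]

# A toy 'photo' dominated by red-ish pixels, with a few blue ones mixed in:
photo = [(250, 10, 10)] * 8 + [(10, 10, 250)] * 3
red_family = dominant_color(photo)
```

Comparing the winning bucket across archives of red-carpet photos, year by year, is one simple way to get the historical color comparison the article mentions.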
“It’s not quite as simple as ‘Here’s a picture of Carrie Underwood; here’s not a picture of Carrie Underwood; here’s another picture of Carrie Underwood,’ and do that 10 times,” Kent said. “That’s literally the process, but sometimes, there is also work to be done to prepare the images for that analysis.”
Because real-time content has such a short shelf life, Watson’s ability to analyze data faster than humans can means more content reaches fans while the event is still unfolding.
Watson has become something of a star in its own right over the past few years through a variety of cultural moments and other campaigns. In 2015, IBM ran a campaign featuring Bob Dylan in which Watson analyzed all of the songwriter’s lyrics to identify key themes. In 2016, it partnered with the high-end fashion brand Marchesa on an LED-lit dress for the Met Gala that lit up based on social media sentiment around the event. And last year, it designed a sculpture for Mobile World Congress in Barcelona based on drawings and writings from the Spanish architect Antoni Gaudí.
“We’ve endeavored to tell the story of music from a variety of different vantage points and perspectives,” Greene said. “We’ve told the story of music from the fan standpoint, from the artist standpoint on the journey to the Grammys stage. We’ve told the story from the perspective of music itself and what it means to us as a society and as a culture.”