It’s been more than a year since Snap Inc. CEO Evan Spiegel rebranded Snapchat as a camera company. The company certainly has changed the way we see the world through the lens of social media. It has taught users to overlay their faces and the real world with augmented reality. The circular cameras on Snap’s Spectacles capture short scenes at eye level. And it has heralded an era of vertical video.
However, it’s paid far less attention to another question: How do we hear the world?
Last week at Cannes Lions, Snapchat chose to focus less on sight and more on sound with an installation exploring audio trends within users’ posts. The exhibit, “Sound Stories,” was created by artist Christian Marclay using machine learning and human curation of hundreds of thousands of public snaps.
According to Andrew Lin, Snapchat’s lead engineer for the project, the idea came about last fall, soon after the company announced that more than 3 billion snaps were created on the app every day. While thinking of how to celebrate the milestone, someone suggested making a movie montage. Another employee proposed something inspired by Marclay’s artwork “The Clock,” a 24-hour loop of video footage from film and TV shows, as a way to show how Snapchat posts play out over the course of a day.
Marclay asked Snap’s team how they might sort through the billions of snaps posted on the platform to explore the audio in videos. Lin’s team, which manages Snapchat’s curation tools, began building a “pitch library” by analyzing the frequency of sounds within certain segments and then checking which piano notes they lined up with. That required a fleet of computers to analyze more than 100 million snaps.
“When I’m talking right now, there is probably a huge spectrum of frequencies that you’re hearing,” Lin said. “Your brain can pull them apart and comprehend what’s happening.”
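The article doesn’t describe Snap’s actual pipeline, but the idea Lin sketches, pulling a dominant frequency out of a sound and matching it to a piano key, can be illustrated with a short, assumed example: take the FFT of an audio segment, find the strongest frequency, and round it to the nearest note in equal temperament (A4 = 440 Hz). The function names here are illustrative, not Snap’s.

```python
import numpy as np

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_piano_note(freq_hz):
    """Map a frequency to the nearest equal-tempered note (A4 = 440 Hz)."""
    midi = round(69 + 12 * np.log2(freq_hz / 440.0))  # standard MIDI formula
    name = NOTE_NAMES[int(midi) % 12]
    octave = int(midi) // 12 - 1  # MIDI note 60 is C4
    return f"{name}{octave}"

def dominant_frequency(samples, sample_rate):
    """Return the strongest frequency component of a mono audio segment."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Synthesize one second of a 2093 Hz tone -- roughly the C7 pitch the
# article says the iMessage chime lands on.
rate = 44100
t = np.linspace(0, 1, rate, endpoint=False)
tone = np.sin(2 * np.pi * 2093.0 * t)

print(nearest_piano_note(dominant_frequency(tone, rate)))  # C7
```

Real speech or street noise would show many peaks at once, which is Lin’s point about the “huge spectrum of frequencies”; a production system would have to segment and weigh those peaks rather than pick a single maximum.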
Marclay wanted to know how to find all the piano sounds, along with guitars, cats and even cats walking across pianos.
That led to five different pieces within the Cannes exhibit, which Marclay wanted to make interactive.
“How can I make the series of installations that would then draw the viewers in and make them aware that they have in fact a tool that they can use differently, maybe more creatively than they do,” Marclay said in a video about the exhibit.
One piece, “All Together,” featured 10 smartphones playing a piece of music composed from millisecond-long notes drawn from 400 snaps. Another, “Tinsel Loop,” recreated Marclay’s 2005 score “Tinsel” using video fragments instead of instruments. And yet another, “The Organ,” let visitors play a keyboard that drew from a library of millions of snap audio files, paired with the images they were captured with.
But the sounds weren’t always pleasant. Some were almost eerie: amalgamations of moments captured far from the south of France.
Cannes wasn’t the first collaboration between Snap and an artist. Last fall, the company created augmented reality lenses with American artist Jeff Koons. The lenses, available in nine cities around the world, unlocked one of Koons’s famous balloon dogs.
Throughout the process of creating “Sound Stories,” Marclay and Snapchat identified themes that might not have surfaced had they merely watched instead of listened. For example, they realized the iMessage notification, a C7 on the piano, appeared in millions of snaps. Others featured people tapping on their phones or pouring candy into bowls.
“That’s a sign of how prevalent that noise is in our lives,” Lin said about the iMessage sounds. “People are doing all sorts of things and that bing is all over the world. The exact same bing, the exact same note played so many times.”