Yik Yak co-founders Tyler Droll and Brooks Buffington spoke in front of about 100 people earlier this week at South by Southwest Interactive. The pair arrived wearing matching Yik Yak socks and joked on stage that sock sales are their top revenue generator.
But the panel discussion wasn't all fun and games. During a Q&A, a few audience members asked what the founders plan to do to make sure the app, which is popular on college campuses and lets users post anonymous messages, or "Yaks," visible to others in the same geographic area, is used safely. One questioner, who wouldn't give her name or the campus where she works, said she uses Yik Yak to listen to students, and that most of them use it for requests unrelated to homework.
"They have questions about physics problems and people feeling bad about themselves receiving anonymous support, but I've got to tell you, the primary thing my campus uses Yik Yak for is hookups," she said. "Sex and drugs—that's 90 percent of the Yaks that I see."
Many critics of the app say its anonymity fosters bullying and online harassment. However, Yik Yak has worked to provide more accountability and just last week introduced username handles that could help it identify how users are interacting on the app.
The same woman asked if the recent rollout of handles for some users might help bring about a more identity-based system.
"Are you going to move to more of a handle-based system, or is anonymity going to continue to be an option so that people can keep bullying each other and putting out requests for sex in the library, drugs?" she said.
Droll said handles are meant to help drive conversation, and if they help "foster the best communities," that's the route the company may take. But moving completely away from anonymity isn't in the works, at least not yet.
Another woman—who works in the tech industry and has a seventh-grade son—drew applause when she said she thinks more civil discourse is needed and asked if one way to make money would be to let her pay to block her son's IP address.
"[Negativity] really is a tough problem to deal with," Buffington said. "On any form of social media, there's the great and the not so great side of things. Internally, we put a lot of time and effort into making sure that our systems are super robust and that the community has a lot of say over what's OK and what's not OK to say in the community."
Buffington said when problems do arise, they're dealt with quickly. However, he said, communities sometimes don't realize the power they have to condemn and correct harassment—that's partly why the app offers filters and reporting tools.
"Things will always get through; bad things will happen," he said. "But I think a lot of it comes down to what is our reaction, and how are we trying to better things on a daily basis?"
Droll and Buffington also said they're more focused right now on growing their user base than on growing revenue.
"Down the road, there are awesome opportunities that are just natural that come with working with great local communities—getting small businesses involved, getting schools involved. There are so many," Droll said. "But right now, our focus is just continuing what we're doing: building local communities."