Anonymous Apps Try to Curb Bad Behavior

Ask.com, which is investing millions to ensure a safe user experience, was well aware of the controversy surrounding the anonymous app Ask.fm before purchasing it last month, according to its CEO Doug Leeds. Here, the exec, who now also oversees Ask.fm, talks about what the company is doing to make the app a safe environment for users and about the app's plans for advertising. Ask.com CEO Doug Leeds oversees Ask.fm, which IAC acquired in August.

Did the controversies about Ask.fm give you pause before buying it?

Yes, naturally. Issues of user safety and trust on a global communications platform like Ask.fm are very real, especially where younger audiences are concerned. This is serious stuff. We knew taking this on meant significant investment in making the platform materially safer, and we are committed to doing what it takes to make that happen.

How are you making the service safer?

What I know from my years running global product justice and policy at Yahoo is that safety is a journey, not a destination. It takes true partnership between industry, government, nonprofits, parents, educators and caregivers to create a path forward that’s meaningful and effective. To that end, here is what we are already doing and publicly committing to doing within the first six months.

Putting the right leadership in place. We’ve parted ways with previous leaders of the company, who didn’t share our vision around safety, and brought on a team of digital safety experts, led by our recently appointed chief safety officer Catherine Teitelbaum.

We took it upon ourselves to proactively pursue conversations with state regulators who had previously voiced concerns. In Attorneys General [Eric] Schneiderman [of New York] and [Douglas] Gansler [of Maryland], we found like-minded partners with similar visions. Together we’re focused on instilling the right policies and procedures to ensure a safer and more enjoyable experience for millions of users.

The ability to effectively filter and moderate the enormous volume of content on Ask.fm is a key pillar of our approach. We plan on investing millions in automated filtering technology as well as dramatically scaling human-powered moderation capabilities across 20-plus languages with native-language community moderators. All content on Ask.fm will be in some way "touched" by a filter, be it automated or via a real person.

As for existing safety features, the product today does provide the ability to block users or anonymous questions and flag and report content. We think all of these features can be better showcased and optimized, and are hitting the ground running with a plan to accomplish this over the next few months.

What is Catherine Teitelbaum’s mandate?

Catherine has held senior positions at Yahoo, including director of global safety and product policy. This, combined with her years of experience as a professional educator, gives her tangible, real-world perspective on the digital behavior and requirements of younger audiences. Catherine’s immediate priority is setting the direction and strategy for safety measures within the product, specifically moderation, as well as spearheading partnerships with third-party mental health services and support groups.

How would you assure brands that Ask.fm is a good place for their messages?

We are currently focused on improving the trust and safety of Ask.fm, so that the value proposition already clear to millions of people becomes known to millions more. And that naturally, but not specifically, includes brands looking to engage this audience. Questions and answers are the building blocks of conversation, so I clearly see a day when the platform can deliver brands a unique way to connect with millions of people in a conversational way.

Right now, Ask.fm runs banner ads. Do you have plans for more aggressive ad units?

Our immediate plans are to reduce the aggressiveness of ad units and monetization in general. Ad revenue is not a short-term metric we care much about at this stage for Ask.fm.