Sleeping Giants Co-Founder Sees Natural Language AI as Next Step in War on Disinformation

Activist and agency veteran Matt Rivitz on his next project

Sleeping Giants founder Matt Rivitz says programmatic is as full of fake news as ever. (Photo: Matt Rivitz, Sleeping Giants)

Matt Rivitz, co-founder of the advertising activism org Sleeping Giants, spent the months leading up to the coronavirus pandemic trying to reimagine what an ad network could be. Partnered with a non-governmental organization, the agency veteran wanted to formulate a more ethical version of the business model he sees as profiting off of misinformation and hate.

But as the onset of the pandemic made investment options dicier, and as Rivitz began to better appreciate the scope of the change that would be needed, he switched gears. He has now started working with a startup called Nobl, which uses natural language detection to identify patterns associated with conspiracy theories and hate content and to redirect the programmatic money that would otherwise fund them toward higher-quality outlets.

Meanwhile, the normal work of Sleeping Giants and the model of consumer activism it helped pioneer continue as ever. This week, after widespread backlash, several advertisers, including T-Mobile, Disney and Papa John’s, distanced themselves from Tucker Carlson’s Fox News show over comments he made about Black Lives Matter protests.

Adweek sat down with Rivitz to talk about the new undertaking, why platforms still aren’t doing enough to crack down on misinformation and the problems with programmatic business models.

This interview has been edited for length and clarity.

Adweek: Why did you choose to team up with this company specifically?
Matt Rivitz: Basically it provides some pretty radical transparency to a business that doesn’t have any right now.
I’ve had so many advertisers come to me and say, “Can you give me your blacklist?” and I never wanted it to be my point of view, because my point of view doesn’t represent everyone. You kind of have to know it when you see it. We wanted to create something that can do it through language instead of doing whitelists and blacklists and having to keep all those decisions under wraps. It’s just basically like, “We’ll show you where you are according to our algorithm, and you can make that call.”

What kinds of problems with the blacklist system led you to this?
It’s inefficient. It doesn’t account for all the sites that open every single day trying to take advantage of the system. We’re heading into an election, and a vast number of sites are going to open that are highly inflammatory on any side of an issue, and they’re going to be able to monetize themselves because the system is so opaque. Google and Facebook aren’t really doing a great job, in my mind, of policing their ad networks because they want to monetize everything.

Platforms like Facebook and Google are also using natural language AI to root out some of the same content you are talking about. Why are those efforts inadequate in your eyes?
It’s not in their business interest. It’s not in my business interest either; it’s a mission-driven thing for me. But I don’t need to make a billion dollars, I just want to clean up the internet. Right now, the business model is to reward engagement, and it just so happens that hate and divisiveness and conspiracies get more engagement than facts. So to get there, we need to change the game a little bit.

What do you think of Twitter’s recent efforts to add more fact-checking disclaimers and Facebook’s refusal to do so?
Look, Twitter has avoided doing the right thing for a very long time. They’re rightfully enforcing their terms of service a bit now, but they haven’t gone all the way. They’ve definitely made bigger steps than most platforms, I will say, in trying to root out a lot of the hate and harassment on the platform. It still happens all the time, though; it happened to me again the other day, when someone published my address. It happens so often now that it’s no big deal anymore.


Patrick Kulp (@patrickkulp, patrick.kulp@adweek.com) is an emerging tech reporter at Adweek.
{"taxonomy":"default","sortby":"default","label":"","shouldShow":"on"}