Fast Chat: Our Information 'Filter Bubble'

MoveOn.org founder on why personalized news is dangerous

Eli Pariser | Credit: Jen Campbell

Adweek: Your book is called Filter Bubble. What’s that?
Eli Pariser:
On the Internet, sites like Google and Facebook show us only the results they think we want to see. If you Google "Egypt" and I Google "Egypt," we will get very different search results. And it’s not just happening at Google. One of the biggest projects that major technology companies and websites are working on is personalizing what we see. So the filter bubble is the unique universe of information each of us sees when surrounded by all these filters.

What's wrong with that?
It surrounds you with the things that algorithms think you’re most likely to click. So it will reflect your interests and your points of view, but it won’t expose you to things you need to know that aren’t quite as clickable. Here’s an example: Afghanistan is something we really need to know about. But it’s not going to do as well in these personalized algorithms because it’s not as clickable.

Like an "eat your vegetables" approach to information? Why should a business care about feeding people their vegetables?
In the long run, it’s a much more satisfying experience to encounter things that get you out of your bubble and to be exposed to new ideas, rather than just being pandered to. The best media sources give people information dessert and information vegetables. A long New Yorker article with cartoons on every page is the perfect example: there are tidbits that make you laugh, and also a 50-page article on Scientology.

Have you been accused of being a Luddite?
Not really. This is what I do, messing around with technology and trying to build tools that help people do stuff online. That’s what I’ve been doing at MoveOn. I love the Internet. I want it to be as good as we all hoped it would be back in the ‘90s. I feel like it's falling short, and I want to give it a nudge in the right direction.

Have you gotten any feedback from Google or Facebook?
I have talked to a bunch of people at Google and Facebook. I would say there are two categories. One category is summed up by a Facebook engineer I talked to who said, "Look, it's very easy to make changes to Facebook that keep people on the site longer. It’s hard to figure out how to build social importance into the news feed. You’re asking us to do something that’s much more difficult and won’t make us as much money in the short term."

The other school of thought, which I’m more inspired by, comes from engineers who feel this is one of the most important, exciting, and difficult challenges facing software writers right now. But it falls below a lot of the other things on the priority list for these companies. The hope is that we, as consumers, become educated enough to understand how [the filter bubble] works and why it’s falling short, and that that will create the pressure to bump it up as a priority for Facebook.

Right after printing cash, taking over the world, and spying on everyone.
Ha, yeah. One problem is that the news industry certainly hasn’t been doing well for itself lately. But for a long time, in terms of profits, the major news networks were not hurting for money, and they had programming designed to introduce people to important news, not just treat them like consumers. If those companies could make the decision to become something more than just a business, so could Google and Facebook.

But news organizations don’t necessarily have that luxury now.
In a way there’s a passing of the torch. Now Google and Facebook are the ones rolling in money, and they could pick up the slack and allow people at their companies to make decisions that positively impact society.

The advertising industry loves personalization. How prominently do you think advertising interests play into Google's and Facebook’s decisions that affect our media consumption? 
I'm not as concerned about advertising that’s targeted as I am about content that’s targeted. It’s one thing to personalize products, and it’s another thing to personalize information. When you’re showing people news based on who you think they are, you can really miss important things that are happening in the world. One of my favorite quotes from Mark Zuckerberg is, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." I think people need to hear about both.

I bet he’d argue that he doesn’t have a responsibility to inform people about Africa.
At times, Facebook says, "Nobody is forced to use Facebook, and we don’t have a responsibility to anyone." But at other times, [Zuckerberg] talks about creating this new world of transparency and connectedness, and if he wants his products to be transformative in that way, he needs to grapple with this responsibility as well.