Your Results May Vary

Eli Pariser Makes the Case Against Internet Personalization

In Squaring Off, Zócalo invites authors into the public square to answer five probing questions about the essence of their books. For this round, we pose questions to Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding From You.

Surfing the Internet has become a highly personalized experience, with sites like Facebook and Google tailoring what you see based on troves of data they’ve collected about you. Pariser, the former CEO of MoveOn.org, argues that personalization creates a “filter bubble” that isolates us from each other and “distorts our perception of what’s important, true, and real.”

1) I don’t feel like I’m missing anything in my personalized Internet experience, and it’s saving me time, effort and money – is that really so bad?

Because this personalization is all happening invisibly, you don’t know what you’re missing. At least when you turn on FOX News or MSNBC, you have a sense of what the editing rule is – how that perspective is distorting the real world. But when you’re searching on Google, or loading Yahoo News, you don’t know on what basis it’s editing what you see. That’s the problem: It’s a product that’s almost impossible to evaluate.

2) So what’s the practical alternative? We can’t have a team of humans editing the whole Internet, and I certainly don’t have time to filter the 5 billion gigabytes that former Google CEO Eric Schmidt says are produced every two days.

The best alternative is code that filters in a smarter way – showing us not just the stuff that’s most compulsively clickable (ahem, cat videos, I’m looking in your direction) but also the things that really matter. Facebook needs an “Important” button to sit next to the “Like” button.

This is one of the challenges with the algorithms that do a lot of the recommending and editing online these days. They’re very smart (Five exabytes of data! Millions of servers!), but they’re still not nearly as smart as human editors, who do things like anticipate what we will need to know, provide a blend of information whose pieces complement each other, and sometimes challenge us to think about things differently. The trouble at the moment is that the code’s good enough to make money for these companies (a lot of it), and it’s not bad when it comes to targeting ads. But when it comes to targeting information, it can really give us a distorted picture of the world.

3) So how would you improve the code?

It just hasn’t been a priority to really build in the best parts of journalistic ethics. The result is that you hear what you want to hear, but not necessarily what you need to hear.

There are lots of places these companies could start – ways of bringing in information that gives us a better sense of the whole picture, or challenges our views. The “Important” button I mentioned is one example. But first they have to decide they want to be more than just profit engines – that they want to serve society well in this way.

4) But if what I want out of the Internet is cat videos and celebrity gossip, who is Google or anyone else to tell me I have to read about war and famine too? Isn’t it a bit paternalistic to say I shouldn’t have the Internet experience I want to have?

I’d say it’s paternalistic for Facebook to say, “We’re going to decide which friends you see and which ones you don’t, and there’s not much you can do about it.” These algorithms already have a lot of values built in – the very process of making a ranked list means deciding which pieces of information are more relevant and true than others. There’s no such thing as “neutral” code.

We all have a more short-term self that just wants to watch cat videos and a more aspirational self that wants to be knowledgeable about the world. The best media helps us balance those two things. This doesn’t.

5) You say a personalized Internet goes against Tom Friedman’s idea of the web as a “global village” uniting people from diverse backgrounds, but how can we create that village in an online space that’s so vast and disparate?

Well, you want a mix. In the last chapter of the book I describe how one of the foundational urban planning books, A Pattern Language, pictures the ideal city. It’s a challenge: if you mix everyone together evenly, so that each neighborhood contains lots of different ethnic groups and ages and kinds of people, you actually end up with a very boring, homogeneous city – it’s all the same. Conversely, you can create a “city of ghettos” where each group lives in its own sealed world, and you lose many of the benefits of having so many kinds of people living in close proximity.

What A Pattern Language suggests is that you want a “mosaic of subcultures” – a city where cultural and ethnic identities are localized enough to grow, but connected enough that people bump into and learn from each other. That’s more or less what I think we want the Internet to look like: some personalization, but also a lot of connection to places far different from our own ideological neighborhoods.

Buy the book: Skylight Books, Powell’s, Amazon

*Photo courtesy of The Ticket Collector.