
Eli Pariser on Filter Bubbles

Eli Pariser offers the idea of a personalized conference. What if we came to an event like Personal Democracy Forum and sorted ourselves by gender, age, political ideology, and hometown? Pretty soon, we’d all be sitting in small rooms, all by ourselves. What if speakers then offered personalized talks, adding explosions for the young male listeners, for instance? “You’d probably like your personal version better… but it would be a bad thing for me to do.” It renders the point of a conference moot – we no longer have a common ground of speeches that we can discuss in the hallways.

Google uses 57 signals to personalize the web for you, even if you’re not logged in. As a result, the results you get on a Google search can end up being very different, even when quite similar people are searching. Eli shows us screenshots of a search for “BP” conducted by two young women, both living in the northeastern US. They get very different results… one set focuses on business issues and doesn’t feature a link on the oil spill in the top three, while the other does. And one user got 141 million results, while the other got 180 million. Just imagine how different those results could be for very different users.

Facebook also engages in customization, using information about the links you click to customize the news that appears in your personal feed. Eli tells us that he’s worked hard to add conservatives to his circle of friends and to follow them on Facebook – so why wasn’t he getting news and links from them? Well, Facebook saw he was clicking more links about Lady Gaga and progressive politics and customized his experience to filter out conservative links.
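
To make the mechanism concrete, here’s a toy sketch of the kind of click-weighted filtering Eli describes – purely illustrative, not Facebook’s actual ranking code; the sources and the rank_feed helper are made up for the example:

```python
from collections import Counter

def rank_feed(stories, click_history, feed_size=10):
    """Return the top feed_size stories, weighted by past clicks per source.

    stories       -- list of (source, headline) tuples
    click_history -- list of sources the reader has clicked in the past
    """
    clicks = Counter(click_history)
    # Score each story by how often the reader has clicked its source;
    # sources never clicked score zero and sink to the bottom of the feed.
    ranked = sorted(stories, key=lambda story: clicks[story[0]], reverse=True)
    return ranked[:feed_size]

if __name__ == "__main__":
    history = ["progressive_blog"] * 8 + ["gaga_fan_page"] * 5 + ["conservative_friend"]
    feed = [
        ("progressive_blog", "New climate bill advances"),
        ("gaga_fan_page", "Tour dates announced"),
        ("conservative_friend", "Op-ed on spending cuts"),
    ]
    for source, headline in rank_feed(feed, history, feed_size=2):
        print(source, "->", headline)
    # With feed_size=2, the conservative friend's link never surfaces --
    # the filter bubble in miniature.
```

The point of the sketch is that no one ever decides to hide the conservative links; a perfectly reasonable relevance heuristic does it quietly, which is exactly the invisibility Eli is worried about.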

Eli terms this phenomenon a “filter bubble” – a special sort of echo chamber. The better our filters get, the less likely we are to be exposed to something novel, unexpected, or uncomfortable. This has always happened – media has always let us choose more familiar and comfortable perspectives and filter out others. But filter bubbles differ from the past in three ways:

- The degree of personalization is higher. You’re no longer just hanging out with the other thousands of readers of The Nation – you’re alone in your bubble.

- They’re invisible. Google doesn’t tell you it’s personalizing your bubble, which means there are big unknown unknowns.

- You don’t choose the filter – it chooses you. You know you’re choosing partisan news when you look at Fox News or Democracy Now, but it’s increasingly impossible to escape the filter bubble.

We thought the battle on the internet was to defeat the censors, to get the story out of Iran around filters and the police. We thought we needed to circumvent the biases of traditional media gatekeepers. But now we’re facing a re-intermediation, this time by algorithms, not by individuals.

We need filters – there’s more information created in a single year now than was created from the beginning of human history through 2008. But we need to think about the values embedded in these filters. Facebook’s filters have something to do with the statement Eli attributes to Mark Zuckerberg that a squirrel dying in front of your house might be more important to you than people dying in Africa. (I haven’t been able to source this quote – I’ll see Eli later today and ask for a footnote.)

Personalization is a great corporate strategy, but it’s bad for citizens. These filters could lead to the end of public conversation, Cass Sunstein worries, or the end of the public. But humans created these tools, and we can change them. We might add a slider to Facebook that lets us see news that’s more or less homogeneous.

First, we need to get over the idea that code is neutral – it’s inherently political.

Eli invites us to continue the conversation, using the #filterbubble tag. I look forward to connecting him with some of the writing I’ve been doing the last two years on homophily.

4 Responses to “Eli Pariser on Filter Bubbles”

  1. JD Lasica says:

    Excellent summary, Ethan. Eli’s talk was the first convincing presentation I’ve seen about the downsides of the personalization bubble. Glad to hear this will be dissected at greater length during Saturday’s PDF unconference.

  2. Jotman says:

    Ethan, this is a most timely topic. The notion of “filter bubbles” first came to my attention with respect to news media coverage of the Gaza aid flotilla.

    My initial observation:
    http://jotman.blogspot.com/2010/06/cnn-coverage-of-attack-on-gaza-aid.html

    Further analysis:
    http://jotman.blogspot.com/2010/06/filter-bubbles-how-want-to-know.html

  3. Ethan makes the point that “The better our filters get, the less likely we are to be exposed to something novel, unexpected, or uncomfortable.” The truth is the exact opposite. :(

    As things stand today, filters do play to our impulses. But that’s because of a lack of profile data about who we are and what we intend or need to do. For filters to work correctly, data needs to flow in real time between both the providers and receivers of internet content.

    So your point is valid, but only for now. Filters will increasingly give us what we need and be able to challenge us through intelligent prediction.

  4. Jacob Stærk says:

    The Zuckerberg quote reminds me of another quote from the penny press.

    “A dog fight in New York is more important than revolution in China.” Not sure, but I believe it was Alexander Hamilton who said it.

    It makes sense if Facebook holds the same values as the penny press did — in a social media version: everyday, local, human interest, private, etc.

Trackbacks/Pingbacks

  1. Google Fellow at the Personal Democracy Forum « Can? We? Save? Africa? - [...] Whether technology exhilarates (using kites to map the impact of the oil spill) or depresses (Eli Pariser talked about …
  2. New Digital Divides: The Personalized “Filter Bubbles” Menacing Democracy | Information Personnes / Persons Information - [...] Pariser is talking about the customization of our relationship to the world around us. For example, Zuckerman et Marcia …
  3. Highlights of Personal Democracy Forum 2010 - [...] The New Digital Divide (CauseGlobal) Ethan Eli Pariser on Filter Bubbles (Ethan [...]
  4. Personalised to death: Filter bubbles « The New Media World - [...] http://www.ethanzuckerman.com/blog/2010/06/03/eli-pariser-on-filter-bubbles/ [...]
  5. Reboot - [...] their established and preferred worldview. MoveOn.org co-founder Eli Pariser has coined the term “filter bubbles” to describe our increased …
  6. The Big Bad (Filter) Bubble. Really? « Blogging from Binney St - [...] terry239328 Politics, Technology Leave a comment Had the opportunity to listen to …
  7. Making news content more transparent « Sameer Padania - [...] to get there, but it’s still reliant to some degree on search (and search itself is an increasingly personalised rather than …
  8. …My heart’s in Accra » In Soviet Russia, Google Researches You! - [...] their experiment correctly isolates the factors involved with personalization. Eli Pariser, in his talk last year at PDF and, …
  9. …My heart’s in Accra » How diverse is your social network? How diverse should it be? - [...] friends so I have a better understanding of the issues and concerns of that community. That’s what led Eli …
  10. Phrase of the Day: filter bubble | The Big Picture - [...] we are to be exposed to something novel, unexpected, or uncomfortable. —Ethan Zuckerman, “Eli Pariser on Filter Bubbles,” My …
