Friday, June 4, 2010

Filter bubbles: How "want to know" kills "right to know"

Sometimes the worst thing is getting what you ask for.

Ethan Zuckerman, describing "filter bubbles," writes:
We thought the battle on the internet was to defeat the censors, to get the story out of Iran around filters and the police. We thought we needed to circumvent the biases of traditional media gatekeepers. But now we’re facing a re-intermediation, this time by algorithms, not by individuals.
Eli Pariser, who coined the term "filter bubble," recently described how Google and Facebook use algorithms to personalize what each user sees. Zuckerman writes:
Google uses 57 available signals to personalize the web for you, even if you’re not logged in. As a result, the results you get on a Google search can end up being very different, even if quite similar people are searching....

Facebook also engages in customization, using information on the links you click to customize the news that appears in your personal feed. Eli tells us that he’s worked hard to add conservatives to his circle of friends and follow them on Facebook, so why wasn’t he getting news and links from them? Well, Facebook saw he was clicking more links about Lady Gaga and progressive politics and customized his experience to filter out conservative links.
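To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of click-history ranking Pariser describes. Everything in it is assumed for illustration: the function name rank_feed, the source labels, and the click counts are mine, not Facebook's actual algorithm. The point is simply that a feed which promotes sources you have clicked before will quietly bury the sources you ignore.

```python
# Illustrative sketch only: a toy click-history feed ranker, NOT
# Facebook's actual algorithm. Source names and counts are invented.
from collections import Counter

def rank_feed(stories, click_history, top_n=10):
    """Order candidate stories by how often the user has clicked
    links from each story's source in the past."""
    clicks_per_source = Counter(click_history)
    def score(story):
        # Sources the user never clicks score zero and sink to the
        # bottom, so the feed converges on what the user already likes.
        return clicks_per_source[story["source"]]
    return sorted(stories, key=score, reverse=True)[:top_n]

# Hypothetical user who clicks pop-culture and progressive links,
# and never clicks links from a conservative friend.
stories = [
    {"headline": "Lady Gaga tour announced", "source": "pop_blog"},
    {"headline": "Conservative friend's op-ed", "source": "conservative_friend"},
    {"headline": "Progressive policy analysis", "source": "progressive_site"},
]
click_history = ["pop_blog"] * 40 + ["progressive_site"] * 25

for story in rank_feed(stories, click_history, top_n=2):
    print(story["headline"])  # the conservative op-ed never surfaces
```

Notice that nothing in the sketch censors anything outright; the conservative story is still in the candidate pool. It simply never ranks high enough to be seen, which is exactly what makes this kind of filtering so hard to detect.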
If you visit the blog of Craig Murray, you can see that many of his readers were deeply perplexed by the coverage of the Gaza aid flotilla on Google News. "Why do you think the most covered story by far over the past few days does not appear in Google's list of most covered stories?" asks one reader. Another: "You have to wonder what munging of the search and indexing system produced a result that has had no mention of by far the biggest story of the past few days." A third commenter responded, "I suspect that search results are modified for each individual computer user account; that is the sort of thing Google specialise in. Try clearing cookies, or try from a different machine, and see if results vary." Some commenters speculate the censorship is intentional, but given the ubiquity of the algorithms, it's hard to say.

Back when the story was "breaking news," I tried to use CNN's US website to follow the Israeli attack on the Gaza aid flotilla. In this blog post, I described how my experience on the CNN website had been shaped by a collaboration between Facebook and CNN. I contrasted my personal experience with the observation of a Jotman reader who had used the same website at around the same time, yet unmediated by any "filter bubble."

To me, the contrast illustrated how it is now possible for the press to 1) satisfy the curiosity of knowledge-networked individuals on a "want to know" basis, while 2) restricting the American public's access to the same information.

On the positive side, many stories that would previously have gone unreported in the mainstream US media may now get reported by these organizations. On the other hand, because the circulation of controversial stories can be strictly limited by algorithm, the "informed few" will often not be aware that only a select group is privy to a controversial headline.

The situation is somewhat analogous to an unintended consequence of the otherwise laudable American charter school movement. Once well-educated parents have managed to get their kids into charter schools or subsidized private institutions, these super-empowered citizens tend not to advocate as hard for the improvement of public education overall. Consequently, educational opportunities and standards may improve for a minority, while for the majority they stagnate or decline. Likewise, to the extent that the better-informed segment of the public feels satisfied with the quality and content of its "personalized" news, I suspect it will be less likely to care about what media companies are feeding less well-connected fellow citizens. Few will notice, fewer will care.

In the old days, when journalists and press advocates talked about the public's "right to know," acts of censorship were glaring and occasionally sparked outrage. In a future where news is freely distributed on the basis of "want to know" algorithms, who will stand up on behalf of the public's "right to know"?

1 comment:

  1. Excellent post, and a crucial issue. Your schools analogy is very apt; we could be creating information castes without people even knowing they are being routed and filtered.

