Eli Pariser’s TED talk about the need to pay attention to the potential pernicious effects of “filter bubbles” hit me at a gut level. Even after reading arguments that offered alternate viewpoints about these initially self-imposed, but now increasingly externally-imposed online information cocoons, and after looking in detail at the concrete example of Wikipedia (which is in my view a major bulwark against bubbling), I still have to admit that I share Pariser’s concerns.
Before getting into it, I should probably note my own natural biases. While I have always been a technophile, an early adopter, and more recently, a grassroots-level tech evangelist, I have never been a programmer or a technical worker. What’s more, I graduated college before the Internet hit the mainstream (and the mainstream is still reeling from the blow). Lastly, I come from a family firmly rooted in the older traditions of information transmission – my father a veteran conflict correspondent for the world’s largest weekly, my mother a press officer for the UN, my sister a print journalist-turned-academic. So I guess it would be fair to say that my viewpoint is somewhat skewed.
That said, I believe that the concerns raised by Pariser are valid. For the uninitiated, Pariser’s argument is that search engines, Facebook, and other websites now use algorithms to customize the information that appears on our screens, based on what those algorithms predict we will want to see (drawing on the data they have collected about our past web behavior). Further, he argues that this customization is potentially dangerous for society: it reinforces filter bubbles, i.e. 360-degree cocoons of the kind of information we want to see instead of the information we should see (a human tendency highlighted in Professor Todd Rogers’ behavioral science course at Harvard), and thus increases societal polarization and reduces our effectiveness as civic actors.
There is some evidence that lessens the impact of Pariser’s argument. In The Wikipedia Revolution: How a Bunch of Nobodies Created the World’s Greatest Encyclopedia, Andrew Lih prompts the realization that the web isn’t only about filter bubbles. Rather than creating online cocoons for like-minded individuals, Wikipedia’s model is built on the fundamental concept of enabling (if not forcing) many people of sometimes radically different viewpoints to face each other’s opinions and collaborate. On Wikipedia, one of the most important sites on the Internet and one that touches the lives of millions, partisans daily achieve consensus to produce accurate, neutral-point-of-view entries – as Lih says, “Wikipedia encourages confrontation and challenge as a necessary part of converging on the truth.”
Additionally, in his article “Are we stuck in filter bubbles? Here are five potential paths out,” Jonathan Stray asks some pertinent questions about Pariser’s assertions. Among other things, Stray notes that our information streams may not have been all that great in the time before technology enabled automated mass-personalization – a fair point, as long as one accepts that trade-offs have been made, and that one of them is in fact the polarization of the populace and the degradation of its members as civic actors. He notes that better filtering algorithms must come – but I ask: until that day, how do we mitigate the effects of the ones we have now? Stray argues that if you map out online activity, you’ll see that the polarization has already happened. And he asks a valid question: when thinking about this issue, why draw the line at the American context and the ‘culture war’? Pariser’s TED talk doesn’t address this question at all.
It is also important to look at the historical context. American societal polarization, fueled by individuals’ ability to choose their own media input since the 1980s, is older than the public Internet. Telecom deregulation and the related cable television revolution began the fragmentation of the mass US audience. The polarization of radio broadcasters and of mainstream print journalism outlets (as they were bought up by conglomerates) followed suit. So the problem in the US is bigger and older than the Web.
I believe that the Internet and related information technologies offer us both paths. On one side, as Stray and the example of Wikipedia show, people can always choose to leave their filter bubbles – technology gives us direct routes to leave our wants behind and face the rest of the world. But Pariser is right that we should be concerned, because as Stray’s mapping shows, the overwhelming majority of Internet users do not do that. And they never will. Far more importantly, with these new customization algorithms, people aren’t making the choices anymore. The algorithms are.