The Filter Bubble

During the run-up to the 2016 American presidential election, we all thought that Clinton would win. The newspapers, the polls and the statistics all assured us of it. Trump didn’t have a chance; Hillary would take the victory.

This did not happen. The statistics were wrong. But one more thing was wrong: nobody saw it coming. Almost half of America’s voters chose Trump, yet in all my travels on the internet I have not seen a single pro-Trump article. I have not heard one person say they voted for Trump. All the writers I read seemed to make fun of him. All the videos I watched wrote Trump off as a joke. Where did all those Trump voters come from?

One possible answer to this dilemma is the filter bubble.


The idea is that, out of all the content the web offers, you only see the small part that the algorithms of Facebook, Google and other sites select for you. They make this selection based on the information they have about you: what kind of articles you click and what kind of pages you visit. The algorithms infer what you like and then present you with more of the same. According to this explanation, I did not see any pro-Trump messages in my feed because Facebook predicted I probably wouldn’t like them and wouldn’t click on them.
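
To make this concrete, here is a toy sketch of such a ‘more of the same’ filter. This is emphatically not Facebook’s or Google’s actual algorithm; the data structures, the scoring rule and the example posts are all invented for illustration.

```python
# Toy "more of the same" feed filter; all names and data are hypothetical.
from collections import Counter

def rank_feed(candidate_posts, click_history, feed_size=10):
    """Rank posts by how well they match topics the user clicked before."""
    # Count how often each topic appears in the user's click history.
    topic_affinity = Counter(
        topic for post in click_history for topic in post["topics"]
    )
    # A post scores higher the more its topics match past clicks.
    def score(post):
        return sum(topic_affinity[t] for t in post["topics"])
    # Keep only the highest-scoring posts: more of the same.
    return sorted(candidate_posts, key=score, reverse=True)[:feed_size]

# A user who only ever clicked anti-Trump articles keeps getting them,
# while pro-Trump posts sink out of sight.
history = [{"topics": ["politics", "anti-trump"]}] * 5
candidates = [
    {"title": "Clinton rally draws record crowd",
     "topics": ["politics", "anti-trump"]},
    {"title": "Why I am voting Trump",
     "topics": ["politics", "pro-trump"]},
]
print([p["title"] for p in rank_feed(candidates, history, feed_size=1)])
# -> ['Clinton rally draws record crowd']
```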

The term ‘filter bubble’ was coined by Eli Pariser (see his TED talk here). According to Pariser, when he asked two of his friends to google BP (British Petroleum), they got vastly different results. One friend got investment news, while the other got an article about the Deepwater Horizon oil spill.

Another example of the filter bubble can be seen in the next picture. It is a cloud of the hashtags on Instagram pertaining to the Israel-Gaza conflict. The bigger a tag’s text, the more often it appears, and the closer two tags are, the more often they have appeared together. Two almost completely separate clouds emerge, with hardly any connections between them.


[Image: tag cloud of Instagram hashtags related to the Israel-Gaza conflict, forming two separate clusters]

(Source: this article. If you have time, I encourage you to read it; it is very good.)
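
For a sense of how such a cloud is built, here is a minimal sketch of the underlying computation: tag frequency determines text size, and pairwise co-occurrence determines how close two tags are drawn. The hashtags and counts below are made up for illustration.

```python
# Toy computation behind a co-occurrence tag cloud; the data is made up.
from collections import Counter
from itertools import combinations

# Each post is represented by the set of hashtags it carries.
posts = [
    {"israel", "idf", "standwithisrael"},
    {"israel", "idf"},
    {"gaza", "freepalestine", "prayforgaza"},
    {"gaza", "freepalestine"},
]

# How often each tag occurs -> drives the text size in the cloud.
tag_size = Counter(tag for post in posts for tag in post)

# How often two tags occur together -> drives how close they are drawn.
pair_strength = Counter(
    frozenset(pair) for post in posts for pair in combinations(sorted(post), 2)
)

print(tag_size.most_common(2))       # e.g. [('israel', 2), ('idf', 2)]
print(pair_strength.most_common(2))  # strongest pairs sit closest together

# Tags that never co-occur (here: 'idf' and 'freepalestine') have zero
# connection strength, so they end up in two disconnected clusters.
```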

The consequences of this phenomenon are immense. People never encounter differing opinions. Everybody sits in their own little bubble, with their own opinions and views. The internet gets split up in such a way that you never realize another side, with other people, even exists; or you assume it is just a tiny part of the population. You are living in an artificial echo chamber.


[Image: Mark Zuckerberg quote]

Is it actually true, though?

Jacob Weisberg, a writer for the website Slate.com, felt that Pariser’s claims were overblown. The BP example was the only evidence that Pariser provided in his book, so Weisberg decided to do his own small investigation. He asked five friends with widely different backgrounds and political affiliations to google some charged political keywords such as ‘Obamacare’.

He found nothing. There were no major differences between the results his friends got. And when he asked Google about this, the answer was as follows: “We actually have algorithms in place designed specifically to limit personalization and promote variety in the results page”.

(Read Weisberg’s original article here; again, a great article.)

Facebook responds

Facebook responded to the filter bubble allegations by conducting a formal study, which was published in Science. It included all Facebook users who had indicated their political preference in their profile.

Below is a graph taken from the study. It shows very clearly what is actually happening. The terms mean the following (a toy sketch after the list walks through the stages):

  • ‘Cross-cutting content’ means political news that opposes the user’s own view, expressed as a percentage of all political news at that stage.
  • ‘Random’ is that percentage across all political news on Facebook, i.e. what you would see if everyone were shown the same stories.
  • ‘Potential from network’ is the percentage among the stories posted by your friends.
  • ‘Exposed’ is the percentage among the stories the user actually saw in the news feed.
  • ‘Selected’ is the percentage among the stories the user clicked on.
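
To make the stages concrete, here is a toy walk through that funnel. The percentages below are invented placeholders, not the study’s actual numbers (those are in the graph).

```python
# Toy walk through the study's funnel; the percentages are made up,
# not the actual numbers from the Science paper.
funnel = [
    ("random", 45.0),                  # if everyone saw the same stories
    ("potential from network", 30.0),  # among stories your friends posted
    ("exposed", 28.0),                 # among stories the feed showed
    ("selected", 25.0),                # among stories the user clicked
]

prev = None
for stage, pct in funnel:
    drop = f" (-{prev - pct:.0f} pts)" if prev is not None else ""
    print(f"{stage:<23}{pct:5.1f}% cross-cutting{drop}")
    prev = pct

# In this illustration the biggest drop happens between 'random' and
# 'potential from network': who your friends are filters far more than
# the ranking algorithm (the 'potential' -> 'exposed' step) does.
```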

One thing immediately grabs the attention: who your friends are matters far more for what you see than the information Facebook has about what you like.

[Image: graph from the Facebook study, showing the percentage of cross-cutting content at each stage]

The real problem: Facebook should act like a news feed.

So, does Facebook actually contribute to the filter bubble problem? Only marginally.

But I can hear you argue: Facebook selects posts for you based on what your friends do, so it’s still kinda their fault, right?

That depends on whether you look at Facebook as a social network (as they claim to be) or as a news feed. A social network is by nature a biased thing: people tend to befriend people who share their ideological beliefs. So in fact, Facebook is just the digital version of the ‘real life filter bubble’.

But when we look at Facebook as a news feed, things are different. Newspapers have a duty to be as objective as possible. Though this is hard to put into practice, and almost impossible to do perfectly, they should strive to do so.

We have actually had this problem before, and it’s hardly unique to the digital age. It was called ‘yellow journalism’. The first sentence of the Wikipedia article on this phenomenon goes as follows: ‘Yellow journalism, or the yellow press, is a type of journalism that presents little or no legitimate well-researched news and instead uses eye-catching headlines to sell more newspapers.’

Sound familiar?

Eventually the problem was recognized, and newspapers realized they had to try to be objective: to shy away from sensationalism and give people the facts. Journalistic ethics were born.

Conclusion

The concept of democracy only works when citizens are well informed. The filter bubble is a dangerous thing, but it is not a recent problem. Confirmation bias has long been recognized: people prefer their own version of the truth and aren’t very good at recognizing and remembering opposing views. On top of that, we are often surrounded by people with similar backgrounds and similar beliefs: a ‘real life’ filter bubble. Even if the digital version is not yet as pervasive, it still adds weight to an ancient problem. The most important step in preventing the filter bubble from actually taking hold in the future is for Facebook, Google and all the other sites to acknowledge that they have a duty to give their users objective information.
