A summary of Eli Pariser’s "The Filter Bubble"

I recently read Eli Pariser's new book, The Filter Bubble. Unlike with most books I read, I felt like summarizing this one. The summary below is very condensed and will probably be most useful if you are already more or less familiar with the ideas in the book. It presents the book's arguments and does not necessarily reflect my own opinions. Let me know if this kind of summary is useful to you.

Personalization is on the rise: Google's introduction of personalized search results marked the beginning of the era of personalization. Google is not alone: Facebook, Yahoo, Microsoft and others all personalize. The trend is accelerating, and the big companies predict that practically all information will be personalized in the near future.

The filter bubble is the universe of information individualized for each one of us. Each of us is alone in it, it is invisible, and we do not choose to enter it. The book aims to explore the implications of the filter bubble, mostly focusing on the risks, and to make the bubble and its effects visible.

By decreasing our exposure to unfiltered information, the filter bubble reduces creativity and learning. It gives big companies sensitive information about us and the power to control what we are exposed to, and thereby takes away some of our freedom. It also makes it easy to avoid involvement in global affairs.

Explicit intelligent agents have been around for over a decade but failed; they are still here, only hidden. Amazon built the first big, successful recommendation system and showed the power of relevance. Then came Google: although it grew through its search engine, it began collecting massive amounts of personal information through free services that require logging in (which also helped it compete with Microsoft). This information is used to personalize both search results and ads. Facebook grew with a different focus: social interactions. Both try to lock users in so they don't switch to competitors. Other, less visible but even bigger companies like Acxiom, BlueKai, TargusInfo and the Rubicon Project are silently collecting massive amounts of information about everyone. These internet giants enable more personalization than ever before, with important implications.

The power of classic newspapers is rapidly declining, and news is becoming fragmented. Traditional newspapers were supposed to inform the public neutrally. The filter bubble has no such obligation.

Disintermediation of news means the removal of the middleman, the newspaper. Information is available in many other ways. Overpersonalization of news can exclude important stories. People trust random bloggers almost as much as they trust the New York Times, and trust is moving from centralized information sources to personalized ones. This reduces our exposure to differing opinions and our active effort to acquire information.

With personalization, the best stories are those that appeal to the most people, and these tend to be superficial and target the lowest common denominator. Important news gets ignored because of its low appeal.

We are biased to believe what we see, and filters affect what we see. They reinforce what we already believe and hide differing opinions, dramatically amplifying our existing confirmation biases. They also reduce the confusion that is inherent in the world; since such confusion drives us to seek new information, they make us lazily stick to our existing beliefs.

Filters promote a narrow focus, similar to stimulants like Adderall. They limit creativity by limiting both exposure to new information and the motivation to seek it. The reduced variety reduces serendipity, and creativity requires some diversity. With filters, we are not even aware of what is being filtered out, so we don't know how narrow our vision has become.

Filters wrongly assume that each of us has a single, consistent identity. Facebook forces us to be the same person with everyone.

Google personalizes based on behavior (searches, clicks). Facebook personalizes based on how we present ourselves (posts, comments). These private and public profiles are different and both are incomplete.

We have "should" interests, which we think we should hold, and "want" interests, which are what we actually consume. Good media mixes should with want, but click-based filters are biased toward "want".

The personal information companies collect about us allows them to use personalized persuasion methods and exploit our weaknesses.

Filters lock us into local maxima by reinforcing even weak or arbitrary preferences through positive feedback loops ("identity cascades"). This actually shapes our thinking: when the internet decides we are dumb, we get dumber.
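
To make the feedback-loop mechanism concrete, here is a toy simulation of my own (an illustration, not anything from the book; the topic names, the click model and the 0.02 reinforcement step are all invented). A mild initial lean gets amplified because the filter shows more of whatever was clicked, which produces more clicks:

```python
import random

random.seed(0)  # make the run reproducible

# A slight initial lean toward one topic (hypothetical numbers).
weights = {"politics": 0.55, "science": 0.45}

for _ in range(50):
    # The filter shows a topic in proportion to its current estimate
    # of our preference.
    shown = random.choices(list(weights), weights=list(weights.values()))[0]
    # We can only click what we are shown, so each click confirms the
    # filter's guess and nudges its estimate further the same way.
    weights[shown] += 0.02
    total = sum(weights.values())
    weights = {topic: w / total for topic, w in weights.items()}

# Whichever topic happened to attract the early clicks now dominates.
print(weights)
```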

Overfitting algorithms can make us more discriminatory; Netflix's recommendations may be an example of stereotyping. To prevent this, algorithms would need to continuously test their models, but they rarely do. Algorithmic induction can lead to information determinism, where past clicks decide the future.
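
As a hypothetical sketch of information determinism in code (my own illustration; the Item type and the category-counting rule are invented, not any real recommender's logic), consider a recommender built purely on past clicks. A category the user never clicked can never surface:

```python
from collections import Counter, namedtuple

Item = namedtuple("Item", ["title", "category"])

def recommend(clicks, catalog):
    # Count how often each category appears in the click history.
    seen = Counter(item.category for item in clicks)
    # Only items from already-clicked categories are eligible, ranked by
    # past click counts: the past fully decides the future.
    return sorted(
        (item for item in catalog if item.category in seen),
        key=lambda item: seen[item.category],
        reverse=True,
    )

clicks = [Item("Gadget review", "tech"),
          Item("Phone launch", "tech"),
          Item("Match recap", "sports")]
catalog = [Item("New laptop", "tech"),
           Item("Election results", "politics"),
           Item("League final", "sports")]

for item in recommend(clicks, catalog):
    print(item.title)
# Prints "New laptop", then "League final". "Election results" can never
# appear, because no past click points to it.
```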

Personalization gives dangerous power to a few companies, which can be exploited for political ends. Google tries to do no evil, but it certainly could if it wanted to.

Filters create a "friendly world" illusion by making unpleasant information invisible.

Personalized political ads fragment campaigns so that no one really knows what is being shown to whom, destroying public debate about important national and global affairs.

Postmaterialism means prioritizing identity over basic material needs. Combined with the filter bubble, it reduces the overlap in the content we each see and erodes shared experience.

Democracy only works if citizens are capable of thinking beyond their narrow self-interest, and this requires a shared view of the world.

Programmers sometimes shun politics, preferring the worlds they can create in code to influence in the real world. Google and Facebook rarely reflect on their moral role. Yet programmers and engineers today have enormous power to shape not just technology but society. Building an informed and engaged citizenry requires engineers to go beyond not doing evil and actively do good.

The filter bubble is now expanding beyond our computers and into reality. Huge amounts of data can easily turn into artificial intelligence. Product placements enter blockbuster movies and bestselling books. Smart devices and augmented reality allow reality itself to be personalized, and filtering now extends to people too. Machine-driven systems take away some of our freedom and control, and they don't work perfectly.

We need to avoid falling into the bubble trap. Opt out of personalized ads, erase cookies, and use sites that, like Twitter and unlike Facebook, give more control over and transparency into their filtering. Become algorithm-literate. Companies should help too, by making their algorithms more transparent and by encouraging political engagement and citizenship. An "important" button could be added alongside "like" so that people themselves can collaboratively filter what's important. Systems should also promote serendipity by exposing users to content outside their narrow interests.
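
One simple way a system could promote serendipity, sketched here as my own assumption rather than a mechanism proposed in the book, is to reserve a fixed fraction of every feed for items outside the user's inferred interests:

```python
import random

def serendipitous_feed(ranked_by_interest, out_of_profile, size=10, explore=0.2):
    """Fill most of the feed with the best-matching items, but reserve
    a fraction of the slots for items outside the user's profile."""
    n_explore = max(1, int(size * explore))
    feed = list(ranked_by_interest[: size - n_explore])
    feed += random.sample(out_of_profile, min(n_explore, len(out_of_profile)))
    random.shuffle(feed)  # mix the unfamiliar items in, don't bury them
    return feed

in_profile = [f"tech story {i}" for i in range(20)]
elsewhere = [f"world-news story {i}" for i in range(20)]
print(serendipitous_feed(in_profile, elsewhere))
```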

Governments can also help, through regulation that gives users more control over the data collected about them and how it is used. Personal data should be treated as personal property.