Filter Bubbles

One of the dangers of Personalized Searching is the creation of a filter bubble, in which the user experiences a web tailored to their interests, exposing them only to content they already agree with and filtering out the rest. This causes people with different interests and beliefs to experience different versions of the web and to become insulated from information that challenges their worldview. It is proposed that this will drive greater divides between people with differing beliefs.
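The filtering mechanism described above can be sketched as a toy ranking function. This is a hypothetical illustration only, not any actual search engine's implementation; the profile, topics, and scoring rule are all invented for the example:

```python
# Toy illustration (not any real search engine's algorithm): a
# hypothetical personalization filter that re-ranks results by how
# closely they match a user's inferred interest profile, so that
# dissimilar viewpoints sink out of view.

def personalized_rank(results, interest_profile):
    """Order results by their overlap with the user's interests."""
    def score(result):
        # Sum the user's affinity for every topic the result touches.
        return sum(interest_profile.get(t, 0.0) for t in result["topics"])
    return sorted(results, key=score, reverse=True)

# A user whose search history suggests strong interest in one viewpoint:
profile = {"politics/left": 0.9, "sports": 0.4, "politics/right": 0.0}

results = [
    {"url": "a.example", "topics": ["politics/left"]},
    {"url": "b.example", "topics": ["politics/right"]},
    {"url": "c.example", "topics": ["sports"]},
]

ranked = personalized_rank(results, profile)
# Opposing-viewpoint content ends up ranked last: the filter-bubble effect.
print([r["url"] for r in ranked])  # ['a.example', 'c.example', 'b.example']
```

The point of the sketch is that no result is ever explicitly censored; content the user is unlikely to engage with simply falls to the bottom of the ranking, which produces the same insulating effect.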

Three Dynamics
1. Users are alone in their bubbles. While others share similar interests, no one has the same bubble; it is therefore pulling us apart.
2. The bubble is invisible. When a person goes to a liberal or conservative news source, they tend to be aware of it. However, “Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong—and you might not even know it’s making assumptions about you in the first place.”
3. Users don’t have a choice about being in a filter bubble. Users typically choose to visit biased news sources, but on a filtered Web, the content comes to them.

Dangers of Filter Bubbles
Perhaps the biggest danger of Personalized Searching is not that these filter bubbles are created, but that people seem to like this new version of the web. Participants in this study claimed they would have ignored the filtered-out results anyway, given that the algorithm is based on their own search history rather than on some other person enforcing an agenda. Though people do attempt to break out of their filter bubbles, research has shown that they mostly seek out opposing opinions in order to find flaws in them. They seek out this information to further reinforce what they already believe, rather than to entertain or better understand a different point of view. Though they expose themselves to information that would normally be filtered out of their searches, they actively reinforce the ideological gap these searches are feared to create, approaching the information with the preconceived belief that it is incorrect and that their own view of the world is the correct one.

As Pariser first notes, then speculates: “Left to their own devices, personalization filters serve up a kind of invisible auto propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown...,” and that, “It’s not just serendipity that’s at risk. By definition, a world constructed from the familiar is a world in which there’s nothing to learn.”