The Filter Bubble

A world constructed from the familiar is a world in which there’s nothing to learn … [since there is] invisible autopropaganda, indoctrinating us with our own ideas.

– Eli Pariser

A filter bubble is the result of personalized search: a website’s algorithm selectively guesses what information a user would like to see based on data about that user, such as location, past click behavior, and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google Personalized Search results and Facebook’s personalized news stream.

The term was coined by internet activist Eli Pariser in his book of the same name. According to Pariser, users get less exposure to conflicting viewpoints and are intellectually isolated in their own informational bubbles. He related an example in which one user searched Google for “BP” and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill; the two results pages were “strikingly different”. Pariser argues the bubble effect may harm civic discourse, though contrasting views suggest the effect is minimal and addressable.
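
To make the mechanism concrete, here is a minimal sketch in Python of how re-ranking the same results against different click histories can produce Pariser’s “strikingly different” pages. This is not any real engine’s algorithm; the result list, topic tags, and scoring rule are all invented for illustration.

    # A toy illustration of personalized re-ranking (hypothetical data and
    # scoring; real search engines use far more signals and complexity).

    # Candidate results for the query "BP"; each carries a topic tag.
    RESULTS = [
        {"title": "BP shares rise on quarterly earnings", "topic": "investing"},
        {"title": "Deepwater Horizon spill: ongoing cleanup", "topic": "environment"},
        {"title": "BP announces new drilling project", "topic": "investing"},
        {"title": "Gulf coast wildlife after the spill", "topic": "environment"},
    ]

    def personalized_rank(results, click_history):
        """Order results by how often the user clicked each topic before.

        click_history maps a topic to a count of past clicks; topics the
        user has clicked more often are boosted toward the top of the page.
        """
        return sorted(results,
                      key=lambda r: click_history.get(r["topic"], 0),
                      reverse=True)

    # Two users issue the identical query but have different histories.
    histories = {
        "investor": {"investing": 12, "environment": 1},
        "activist": {"investing": 0, "environment": 9},
    }

    for name, history in histories.items():
        print(f"Results for the {name}:")
        for r in personalized_rank(RESULTS, history):
            print("  -", r["title"])

Because the ranking rule only rewards topics a user has already clicked, each user’s page drifts further toward the familiar with every click, which is exactly the feedback loop Pariser describes.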

. . . . .

Consider trying out the DuckDuckGo search engine and the uBlock Origin browser extension.

See Also

How Web Sites Vary Prices Based on Your Information

Is Google Making Us Stupid?

Facebook’s Political Influence Under a Microscope

Don’t Bubble Us

Don’t Track Us