Our main news source customizes what it tells us to confirm what we already believe. It hides conflicting information from us, leaving us confident that people who believe differently than we do are stupid or evil. That news source, of course, is the set of computer algorithms that learn what we like, serve us more of it, and hide what we don't. Here is an explanation that appeared as the 18 December 2014 excerpt from delanceyplace.com. -Tom Denham
From The Filter Bubble by Eli Pariser. Because of the personalization of the internet, a search for the same term by two different people will often return very different results. We are each increasingly served not only ads for what we are likely to want, but also news and information that is familiar and confirms our beliefs. The trouble is that we are increasingly unaware of what is being filtered out and why, leaving each of us more and more inside our own unique, self-reinforcing information bubble. Author Eli Pariser calls this 'the filter bubble', and it leaves less room for encounters with unexpected ideas:
“Most of us assume that when we ‘google’ a term, we all see the same results — the ones that the company’s famous PageRank algorithm suggests are the most authoritative based on other pages’ links. But since December 2009, this is no longer true. Now you get the result that Google’s algorithm suggests is best for you in particular — and someone else may see something entirely different. In other words, there is no standard Google anymore.
“It’s not hard to see this difference in action. In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term ‘BP.’ They’re pretty similar — educated white left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw news. For one, the first page of results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP.
“Even the number of results returned by Google differed — about 180 million results for one friend and 139 million for the other. If the results were that different for these two progressive East Coast women, imagine how different they would be for my friends and, say, an elderly Republican in Texas (or, for that matter, a businessman in Japan).
“With Google personalized for everyone, the query ‘stem cells’ might produce diametrically opposed results for scientists who support stem cell research and activists who oppose it. ‘Proof of climate change’ might turn up different results for an environmental activist and an oil company executive. In polls, a huge majority of us assume search engines are unbiased. But that may be just because they’re increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click. …
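To make the mechanism concrete, here is a minimal sketch of interest-based re-ranking, the general technique Pariser is describing. Google's actual signals and weights are not public, so the scoring function, topics, and numbers below are all hypothetical: each result carries a base relevance score, results whose topic matches the user's click history get a boost, and the same query comes back in a different order for different users.

```python
# Hypothetical sketch of personalized re-ranking; real search engines use
# many more signals, and none of these weights come from Google.

def personalize(results, user_interests):
    """Re-rank results by base relevance plus an interest-match boost.

    results: list of (title, topic, base_relevance) tuples
    user_interests: dict mapping topic -> affinity learned from past clicks
    """
    def score(result):
        title, topic, base = result
        return base + user_interests.get(topic, 0.0)  # boost familiar topics
    return sorted(results, key=score, reverse=True)

# Two users issue the same query, "BP", over the same candidate results:
results = [
    ("BP investor relations", "finance", 1.0),
    ("Deepwater Horizon spill coverage", "news", 1.0),
]

investor = {"finance": 0.5}  # historically clicks finance links
activist = {"news": 0.5}     # historically clicks news links

print([title for title, _, _ in personalize(results, investor)])
# ['BP investor relations', 'Deepwater Horizon spill coverage']
print([title for title, _, _ in personalize(results, activist)])
# ['Deepwater Horizon spill coverage', 'BP investor relations']
```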
“For a time, it seemed that the Internet was going to entirely redemocratize society. Bloggers and citizen journalists would single-handedly rebuild the public media. Politicians would be able to run only with a broad base of support from small, everyday donors. Local governments would become more transparent and accountable to their citizens. And yet the era of civic connection I dreamed about hasn’t come. Democracy requires citizens to see things from one another’s point of view, but instead we’re more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we’re being offered parallel but separate universes.
“My sense of unease crystallized when I noticed that my conservative friends had disappeared from my Facebook page. Politically, I lean to the left, but I like to hear what conservatives are thinking, and I’ve gone out of my way to befriend a few and add them as Facebook connections. I wanted to see what links they’d post, read their comments, and learn a bit from them.
“But their links never turned up in my Top News feed. Facebook was apparently doing the math and noticing that I was still clicking my progressive friends’ links more than my conservative friends’ — and links to the latest Lady Gaga videos more than either. So no conservative links for me.
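The behavior Pariser observed is consistent with a simple engagement filter. Facebook's real ranking model is not public, so the click-rate cutoff and the data below are purely illustrative: friends whose links you rarely click fall below a threshold, and their posts quietly stop appearing.

```python
# Hypothetical sketch of an engagement-based feed filter; the cutoff value
# and click counts are invented for illustration.

def rank_feed(posts, click_history, cutoff=0.2):
    """Keep only posts from friends whose historical click rate beats cutoff.

    posts: list of (friend, link) tuples
    click_history: dict mapping friend -> (clicks, impressions)
    """
    def click_rate(friend):
        clicks, impressions = click_history.get(friend, (0, 1))
        return clicks / impressions
    return [(friend, link) for friend, link in posts
            if click_rate(friend) >= cutoff]

click_history = {
    "progressive_friend": (8, 10),   # clicked often -> stays in the feed
    "conservative_friend": (1, 10),  # rarely clicked -> silently dropped
}
posts = [
    ("progressive_friend", "op-ed you already agree with"),
    ("conservative_friend", "op-ed you might learn from"),
]
print(rank_feed(posts, click_history))
# [('progressive_friend', 'op-ed you already agree with')]
```

Note the feedback loop: what gets filtered out can no longer be clicked, so a low click rate only falls further.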
“I started doing some research, trying to understand how Facebook was deciding what to show me and what to hide. As it turned out, Facebook wasn’t alone.
“With little notice or fanfare, the digital world is fundamentally changing. What was once an anonymous medium where anyone could be anyone — where, in the words of the famous New Yorker cartoon, nobody knows you’re a dog — is now a tool for soliciting and analyzing our personal data. According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like ‘depression’ on Dictionary.com, and the site [automatically collects and stores information about your computer or mobile device and your activities] so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open — even for an instant — a page listing signs that your spouse may be cheating and prepare to be haunted with DNA paternity-test ads. The new Internet doesn’t just know you’re a dog; it knows your breed and wants to sell you a bowl of premium kibble.”
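As one way to picture the retargeting loop described above (real ad networks are far more elaborate, and every name and value here is hypothetical): a third-party tracker sets a single cookie, records the topic of each page you visit on sites that embed its beacon, and later picks ads to match that profile.

```python
# Hypothetical sketch of cross-site ad retargeting via a shared tracker.

tracker_log = {}  # cookie_id -> set of topics seen across sites

def record_visit(cookie_id, site, topic):
    """Called by the tracker's beacon embedded on each participating site."""
    tracker_log.setdefault(cookie_id, set()).add(topic)

def pick_ad(cookie_id, ad_inventory):
    """Serve the first ad whose topic matches the visitor's browsing profile."""
    profile = tracker_log.get(cookie_id, set())
    for topic, ad in ad_inventory:
        if topic in profile:
            return ad
    return "generic ad"

record_visit("cookie-123", "dictionary.com", "depression")
record_visit("cookie-123", "abcnews.com", "cooking")
print(pick_ad("cookie-123", [("cooking", "Teflon-coated pots"),
                             ("travel", "cheap flights")]))
# Teflon-coated pots
```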
The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think
Author: Eli Pariser
Publisher: Penguin Books
Copyright 2011 by Eli Pariser
Pages 2-3, 5-7