Media bias and filter bubbles

For your homework today, you read laterally by looking at what others have said about one of the sources listed on the far left (blue) or far right (red) on AllSides. Based on that reading, you were asked to answer a few questions:

  • What did you discover from other sources? Did you trust the information you found?
  • What could you find about each of these sources in terms of the site’s process, expertise, and aim?
  • What makes consuming info from these sources a potential problem for our country?

After Trump was elected in November 2016, Saturday Night Live ran this satirical ad for a planned community called “The Bubble”:

The ad poked fun at the privileged position of hipsters, progressives, and white millennials who can choose to close themselves off from a version of America that threatens their worldview (they jokingly call it “Brooklyn”). It’s a funny skit because it plays on some of the fundamental trouble with a networked view of reality.

The concept of the “bubble” actually draws from tech guru Eli Pariser, who gave this famous TED Talk on filter bubbles back in 2011 and published a book about them soon after:

Since Pariser’s talk eight years ago, information has become even more filtered. For example, a 2014 Pew study found key differences between people on the left and the right of the political spectrum. Facebook functions as an echo chamber for consistent conservatives, while consistent liberals are more likely to de-friend people who disagree with them politically. Moreover, while consistent liberals have a more diverse media diet, consistent conservatives distrust the news media and hence mostly stick to Fox News for their information. At the same time, Pew found a huge cohort of people (46% of those surveyed) whose politics are more mixed; those folks tend to listen rather than speak when it comes to politics online. In other words, they tend to be influenced by their filter bubbles rather than contribute to them.

The mechanics of filter bubbles

So which specific technologies described by Pariser drive filter bubbles? In other words, what makes them work? In your assigned groups, take 10-15 minutes to read your group’s Pariser quotation below and complete the related task. Be ready to share your findings with the class.

Group 1

“Even if you’re logged out, one engineer told me, there are 57 signals that Google looks at — everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located — that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.”
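Pariser doesn’t list the 57 signals, but the mechanism is easy to picture. Here is a toy sketch in Python (every signal name and weight below is invented, not Google’s actual system) showing how a few per-visitor signals could re-rank the exact same query for two different people. It loosely mirrors an example from Pariser’s talk, in which two friends searched “Egypt” and got very different results:

    # Toy sketch only: NOT Google's actual algorithm. It shows how a few
    # per-visitor signals (rough location, language) could re-rank the
    # exact same results for the exact same query.

    def personalize(results, signals):
        """Re-rank identical search results using per-visitor signals."""
        def score(result):
            s = result["base_relevance"]
            if signals["region"] == result["region"]:
                s += 1.0   # invented weight: boost local results
            if signals["language"] == result["language"]:
                s += 0.5   # invented weight: boost language matches
            return s
        return sorted(results, key=score, reverse=True)

    results = [
        {"title": "Egypt travel deals", "base_relevance": 0.9, "region": "US", "language": "en"},
        {"title": "Protests in Egypt", "base_relevance": 0.8, "region": "EG", "language": "en"},
    ]

    # Same query, two visitors, two different "Googles":
    print([r["title"] for r in personalize(results, {"region": "US", "language": "en"})])
    print([r["title"] for r in personalize(results, {"region": "EG", "language": "en"})])

The point is that the ranking function never changes; only the signals do, which is why there is no “standard Google” anymore.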


Task: Do a search for “Google Analytics” and see what this tool can do. What are some of the features of Analytics that any web designer can use to track or influence readers?

Group 2

“There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized — different people get different things. Huffington Post, the Washington Post, the New York Times — all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.”

Task: Which of your online accounts have personalization settings? Which of these can you turn off? Do some poking around the web to see if you can find advice on this and be ready to share some tips with us.

Group 3

“…what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out.”


Task: Visit the site Blue Feed, Red Feed, play around with the filter bubbles, and read the methodology. Be ready to explain it to the class. 

Group 4

“What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important.”
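To see what “keyed to relevance” means in practice, here is a toy sketch in Python (the scoring fields, weights, and slot counts are all invented for illustration) contrasting a feed ranked purely by predicted relevance with one that reserves a slot for important stories, the way a human editor might:

    # Toy sketch only: contrasts pure relevance ranking with a feed that
    # reserves space for important-but-unclicked stories. All numbers
    # are invented.

    def relevance_only(stories, k=2):
        """Show only the k stories the user is predicted to engage with."""
        return sorted(stories, key=lambda s: s["relevance"], reverse=True)[:k]

    def with_editorial_floor(stories, k=2, reserved=1):
        """Guarantee `reserved` slots to the most important stories."""
        important = sorted(stories, key=lambda s: s["importance"], reverse=True)[:reserved]
        rest = [s for s in stories if s not in important]
        filler = sorted(rest, key=lambda s: s["relevance"], reverse=True)[:k - reserved]
        return important + filler

    stories = [
        {"title": "Celebrity gossip", "relevance": 0.9, "importance": 0.2},
        {"title": "Friend's vacation photos", "relevance": 0.8, "importance": 0.1},
        {"title": "Humanitarian crisis abroad", "relevance": 0.3, "importance": 0.9},
    ]

    print([s["title"] for s in relevance_only(stories)])        # crisis never surfaces
    print([s["title"] for s in with_editorial_floor(stories)])  # crisis gets a slot

In the first feed, the low-engagement crisis story is silently edited out; in the second, an “editorial floor” forces it in, which is roughly the embedded ethics Pariser says algorithms still lack.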

Task: Read this piece Facebook wrote in 2015, telling the public that the company “…discovered that if people spend significantly more time on a particular story in News Feed than the majority of other stories they look at, this is a good sign that content was relevant to them.” What other factors does Facebook consider when calculating meaningful content? List them and tell us which of these give users some control.
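The time-spent signal quoted in this task is simple enough to sketch. Here is one hypothetical way to express it in Python (the 1.5x threshold is invented; Facebook does not publish its actual formula):

    # Toy sketch only: dwell time as a relevance proxy, per the 2015 post.
    # The 1.5x threshold is invented; Facebook's real formula is not public.

    def looks_relevant(dwell_seconds, typical_seconds, factor=1.5):
        """Flag a story as relevant when the reader lingers well past their norm."""
        return dwell_seconds > factor * typical_seconds

    print(looks_relevant(45, 10))  # True: far longer than this user's average
    print(looks_relevant(8, 10))   # False: a quick scroll past

Notice that a signal like this requires no click, like, or comment at all; merely pausing on a story teaches the feed what to show you next.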


Homework for Thursday, 3/7

Read Caulfield, Chapters 20-25

Use WP Post #4 to go upstream on a news report that cites a recent study. If you’re stuck, type “recent study” into Google and click the “News” tab at the top. Like so:

Once you find a news story that cites a study or piece of scholarship, go upstream to find that original study. [Note: You may have to log in to Rowan’s library to access some of these.]

Even if you cannot find the actual study, use the strategies from the chapters above to check the credibility of the journal and the expertise of the author(s). If you get stuck on one strategy, discuss it in your post, but move on to another. Not all journals will have an impact factor and not all authors can be easily found in Google Scholar, but you should look for both. Ultimately, your goal is to use these search strategies to “accurately summarize the state of research and the consensus of experts in a given area, taking into account majority and significant minority views.”