Today we are continuing to talk about filter bubbles; we’ll then see how you did going upstream and reading laterally on recent studies.
Filter bubble follow-up
On Monday you compiled common stories from news feeds representing three different filter bubbles: conservative, liberal, and mainstream. Let’s look at them.
- What headlines and stories cut across these three bubbles?
- What stories seem unique to those bubbles? Do they draw from different sources—interviews, studies, unnamed sources, etc.? How do they link to other sources or stories?
- Are there any claims or supposed facts in these stories that are just begging for a more detailed fact-check? Which ones and why? How might you go about it?
Watch this TED Talk by Eli Pariser on filter bubbles.

Since Pariser’s talk seven years ago, information has become even more filtered. For example, a 2014 Pew study found key differences between folks on the left and the right of the political spectrum (see the image on the right). While Facebook is an echo chamber for consistent conservatives, consistent liberals are likely to defriend people who disagree with them politically. Moreover, while consistent liberals have a more diverse media diet, consistent conservatives distrust the news and hence mostly stick to Fox News for their information. At the same time, Pew found that there’s a wide cohort of people (46% of those surveyed) whose politics are more mixed; those folks tend to listen, rather than speak, when it comes to politics online. In other words, they tend to be influenced by their filter bubbles rather than contribute to them.
The mechanics of filter bubbles
So which specific technologies raised by Pariser drive filter bubbles? In other words, what makes them work? In assigned groups, take 10-15 minutes to read your group’s Pariser quotation below and complete the related task. Be ready to share your findings with the class.
Group 1
“Even if you’re logged out, one engineer told me, there are 57 signals that Google looks at — everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located — that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.”
Task: Do a search for “Google Analytics” and see what this tool can do. What are some of the features of Analytics that any web designer can use to track or influence readers?
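If it helps to picture the mechanics, here is a minimal sketch of the standard Google Analytics (gtag.js) snippet that a site owner pastes into every page; the GA_MEASUREMENT_ID below is a placeholder, not a real account.

```html
<!-- A sketch of the standard gtag.js snippet; GA_MEASUREMENT_ID is a placeholder -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag() { dataLayer.push(arguments); }
  gtag('js', new Date());               // stamp when this visit started
  gtag('config', 'GA_MEASUREMENT_ID');  // report the page view, along with browser, device, and rough location
</script>
```

Notice that readers never see any of this: the reporting happens silently as the page loads, which is part of how the “57 signals” Pariser mentions get collected.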

Group 2
“There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized — different people get different things. Huffington Post, the Washington Post, the New York Times — all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.”
Task: Which of your online accounts have personalization settings? Which of these can you turn off? Do some poking around the web to see if you can find advice on this and be ready to share some tips with us.

Group 3
“…what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out.”
Task: Visit the site Blue Feed, Red Feed, play around with the filter bubbles, and read the methodology. Be ready to explain it to the class.

Group 4
“What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important.”
Task: Read this piece Facebook wrote in 2015, telling the public that the company “…discovered that if people spend significantly more time on a particular story in News Feed than the majority of other stories they look at, this is a good sign that content was relevant to them.” What other factors does Facebook consider when deciding which content is meaningful? List them, and tell us which of these give users some control.

Homework for Monday, 2/12
- Browse the “Fieldguide” section of Caulfield (look in the Table of Contents from the main page) and skim sections that interest you.
- In your 5th and final WordPress post, briefly propose 3 different possible claims to fact-check for your final Truth-o-meter post. This is the post that is worth 50% of your unit grade — re-read the assignment page for details and look at my example.
- Use a numbered list format in WordPress to separate these 3 claims and write a paragraph (¶) for each that makes a case for why it would be a good choice for a longer, ~1,000-word post. You might consider re-reading the first row of the rubric, as well as conducting some preliminary fact-checking to find out whether each claim will work for this assignment. In other words, this is your chance to do the preliminary research that will help you be successful. We’ll share these with each other in class on Monday. Do you feel like you have good places to look for potential claims?