Feeds are everywhere today, from social media to law enforcement. How might we think differently about society if we understood feeds more clearly?

Speaking today at the Cornell Department of Communication is Christian Sandvig (@niftyc), the Marshall McLuhan Professor of Digital Media at the University of Michigan. He’s also director of ESC, the Center for Ethics, Society, and Computing. I was able to liveblog Christian’s talk, which gives a preview of an upcoming book he’s writing.

while data collection and algorithms have made feeds attractive, they’re not inevitable

What comes to mind when you think about a digital feed? Christian opens up by telling us about Safebook, a browser plugin by Ben Grosser that promises to “make Facebook content safe.” Ben’s plugin does this by removing all content from Facebook. While Safebook draws our attention to Facebook’s News Feed, feeds are actually everywhere.

Ben Grosser’s Safebook removes all content from Facebook

Before Facebook introduced its News Feed, people had to visit each friend’s page to find out what they were doing. The News Feed was so unpopular at launch that college students started a protest movement. Mark Zuckerberg argued that Facebook was just rearranging information already on the site, but students argued that centralizing that information made Facebook creepier, as if people were spying on their friends.

People often believe that the “like” was added to Facebook because the company thought it would be an interesting way for people to express themselves online. Unlike the Facebook “poke,” the “like” was actually created by the company’s engineers in order to collect information about what to prioritize on the feed, Christian tells us.

A feed is a constant stream of information that is automatically curated

The Facebook News Feed, like any feed, is a constant stream of information that is curated in some way. Developments in computing technology have made it easier for computers to make automatic decisions about what we see. The idea of automated curation of continuous information has now appeared everywhere.

Originally, the Facebook algorithm was barely worth calling an algorithm: it showed everything your friends did, in reverse chronological order. Early ranking was based purely on likes; now it’s a complex trade secret that employs “hundreds of factors,” including the network of relationships, how long your eyes linger on something, whether your friends are involved in the conversation, and more.
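The shift Christian describes, from a plain reverse-chronological list to a multi-factor ranking, can be sketched in a few lines. This is a toy illustration, not Facebook’s actual algorithm: the signal names (`likes`, `from_close_friend`) and the weights are invented for the example.

```python
from datetime import datetime, timezone

# Hypothetical posts, each carrying a few engagement signals.
posts = [
    {"id": 1, "ts": datetime(2019, 5, 1, tzinfo=timezone.utc), "likes": 2,  "from_close_friend": False},
    {"id": 2, "ts": datetime(2019, 5, 3, tzinfo=timezone.utc), "likes": 0,  "from_close_friend": True},
    {"id": 3, "ts": datetime(2019, 5, 2, tzinfo=timezone.utc), "likes": 50, "from_close_friend": False},
]

# The original feed: everything your friends did, newest first.
reverse_chron = sorted(posts, key=lambda p: p["ts"], reverse=True)

# A toy multi-factor ranker; weights are arbitrary for illustration.
def score(post):
    recency = post["ts"].timestamp() / 1e9  # newer -> slightly larger
    return 0.5 * recency + 1.0 * post["likes"] + 5.0 * post["from_close_friend"]

ranked = sorted(posts, key=score, reverse=True)
print([p["id"] for p in reverse_chron])  # [2, 3, 1] -- strictly by time
print([p["id"] for p in ranked])         # [3, 2, 1] -- likes now dominate
```

The point of the sketch is that once ranking is a weighted sum of behavioral signals, changing the weights silently changes what everyone sees, which is part of why the real system’s “hundreds of factors” are so consequential.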

Seeing Feeds Everywhere

Christian next shows us art from the “Natural History” series by Jenny Odell (2015). Odell’s art reminds us that social media feeds are now everywhere.

Consider email, for example. We used to think of email as postal mail: we saw every piece of mail that arrived. That changed with spam filters, which converted email into a feed by having machines decide which mail we would see. Since then, mail clients have evolved to work even more like feeds. Gmail, for example, uses algorithms to surface important messages and to sort emails into different tabs. Just like Facebook, Gmail pays attention to what you look at, what you click on, and what you reply to, and uses that information to prioritize what you see first.
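The prioritization Christian describes can be illustrated with a toy scorer: messages from senders you historically open and reply to float to the top. This is a hypothetical sketch, not Gmail’s method; the signal names and weights are invented.

```python
# Per-sender engagement history (hypothetical data).
history = {
    "boss@example.com":       {"open_rate": 0.9, "reply_rate": 0.7},
    "newsletter@example.com": {"open_rate": 0.1, "reply_rate": 0.0},
}

def priority(sender):
    """Score a sender from past behavior; unknown senders get zero."""
    h = history.get(sender, {"open_rate": 0.0, "reply_rate": 0.0})
    return 2.0 * h["reply_rate"] + 1.0 * h["open_rate"]

inbox = ["newsletter@example.com", "boss@example.com"]
inbox.sort(key=priority, reverse=True)
print(inbox)  # boss@example.com first
```

Even this tiny example shows the feed logic at work: the inbox is no longer a record of what arrived, but a ranking of what a model predicts you will engage with.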

Christian next tells us about search engines. When DEC developed AltaVista in 1995, many researchers had thought it would be impossible to create even a somewhat useful web search engine. If you used AltaVista, you might have to scroll through many pages of results. As search engines improved, they started to look more like feeds. Now, when you click on results on Google, the search engine incorporates information about people’s behavior into its rankings. Searches are personalized, with results that can change quickly over time.

Christian Sandvig demonstrates the “Digital Stakeout Scout,” a social media surveillance tool used by police departments in the U.S.

How far can we take the idea that the feed has escaped Facebook to become more widespread? Predictive policing algorithms and social media surveillance systems also use feeds. Christian shows us a screenshot of the “Digital Stakeout Scout,” a system that opens tabs labeled “Terrorists,” among other things. The system shows a real-time feed of people it thinks could be terrorists. Similarly, computer vision security systems show guards a queue of things the computer thinks are interesting. Again, certain things are prioritized, with tabs, stars, and an opportunity to explore more: the genre of the feed, put to work for security.

The feed as a genre of information

What does it mean for the feed to be a genre of information, perhaps as important as the novel? Rather than give us a specific definition of a feed, Christian encourages us to think about it as a loose genre. What do we gain from thinking about information as a feed as opposed to some other way? To answer this question, Christian raises some common objections:

Is the idea of the feed purely descriptive, without giving us a normative idea about the world we want to see? People object that feeds are the inevitable unfolding of growth in data collection and digital technology. Maybe they’re just the natural outcome of the world we live in?

Feeds aren’t inevitable, Christian says. He points out that in his empirical research with Motahhare Eslami and others, they found that people prefer different kinds of algorithms: some people like random order, some people like reverse chronological order. Others prefer to keep their list of friends small so they can see everyone’s activities. Perhaps a platform could exist that served these people and involved less data profiling, surveillance, or automated curation. Even if we accept the feed, we can imagine a range of different kinds of feeds within the genre.
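The finding that different people prefer different orderings suggests a design in which the ordering is a pluggable, user-chosen policy rather than a fixed ranking model. A minimal sketch of that idea, with invented names:

```python
import random

posts = [{"id": i, "ts": i} for i in range(5)]  # ts: larger = newer

# Each ordering is a policy a user could choose, per Eslami et al.'s
# finding that preferences vary (random, reverse-chronological, etc.).
orderings = {
    "reverse_chronological": lambda ps: sorted(ps, key=lambda p: p["ts"], reverse=True),
    "chronological":         lambda ps: sorted(ps, key=lambda p: p["ts"]),
    "random":                lambda ps: random.sample(ps, len(ps)),
}

choice = "reverse_chronological"  # a hypothetical user preference setting
feed = orderings[choice](posts)
print([p["id"] for p in feed])  # [4, 3, 2, 1, 0]
```

Notably, none of these policies requires behavioral profiling: they need only the posts themselves, which is the point of imagining feeds with less data collection built in.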

what if companies’ interests aren’t aligned with users?

Christian also introduces the idea of “corrupt personalization” by asking us: “what if companies’ interests aren’t aligned with users’?” Many people argue that companies have an incentive to meet user needs, so they would be unlikely to do things that users wouldn’t want. Christian points out that personalization often happens in situations where multiple options would all be acceptable to users. But if people don’t know the alternatives, they might still be satisfied with options that aren’t in their best interests.

To illustrate this, Christian mentions that Google argued for many years in public statements that it never manually intervened in search results. But a European antitrust case showed that the company was altering its results to promote Google products. At one point in Facebook’s history, the company chose to promote updates from friends and family if those people mentioned products. We can’t assume that the interests of companies are aligned with the public’s.

Why should we care about this? And why do genres matter? Scholars like Walter Ong have argued that the novel was so significant for society that it created the idea of the individual: reading a novel was a way to experience the interior voice of another person. This is just one argument for the importance of a genre in society. What is the equivalent argument for the feed?

How are feeds shaping society?

Social Soul by Lauren Lee McCarthy and Kyle McDonald

Christian next shows us Social Soul by Lauren Lee McCarthy and Kyle McDonald. The artwork surrounds people with personalized media to imagine what it might be like to be inside the newsfeed.

To illustrate the value of thinking of the feed as a genre, Christian tells us about Shoshana Zuboff’s book The Age of Surveillance Capitalism. Zuboff argues that people’s attention is being shaped and harvested by companies that create feeds. But not all feeds are optimized to hold onto our attention. If you think of the feed as a genre of information, maybe Zuboff’s book is about just a few companies rather than a deeper development in capitalism.

Next, Christian points out that many researchers have observed that the workings of a system become hard to understand past a certain level of complexity. Some computer scientists say it’s terrible that we don’t understand how these systems make decisions; others say that since the algorithms are so complicated, we shouldn’t use them at all. As we use feeds for more and more things, Christian says, we’re going to face this dilemma across more parts of society.

by using machine learning and feeds, humans are outsourcing knowing to machines

Ian Hacking famously came up with a typology of knowledge in science, says Christian. For example, Hacking pointed out that a mathematical proof is a different way of knowing something than an experiment. Maybe feeds involve yet another form of knowledge. By using machine learning and feeds, humans are outsourcing knowing to machines. Even if we are able to take actions based on these systems, we don’t necessarily understand what the computer knows. We can do follow-up tests of the system, but we no longer take it as our job to know or act: the computer acts, and that’s fine.

Once we can recognize feeds, we start to see them everywhere, Christian says. Yet while data collection and algorithms have made feeds attractive, they’re not inevitable. We still have many questions to debate about privacy, influence, and governance of feeds, as well as decisions to make about where we do or do not want feeds.