Despite evidence that only a minority of Facebook users realize their news feeds are curated by an algorithm, plenty of homegrown ideas circulate about how posts are chosen to appear.
Facebook, of course, won’t tell us precisely how the feed works.
But researchers have gathered 10 “folk theories” of the mysterious algorithm in a paper describing a study of 40 users. The seven authors, hailing from the University of Illinois at Urbana-Champaign, the University of Michigan, and California State University, Fresno, titled their paper “First I ‘like’ it, then I hide it: Folk Theories of Social Feeds.”
Folk theories are everyday concepts of how the world works that circulate informally. Unsurprisingly, they often diverge from the truth. But that doesn’t mean they’re worthless. Crucially, they can give designers a window into users’ minds, providing insights to make their services and tools better suited to real-world use.
The researchers reference a 1986 study of folk knowledge (pdf) around home-heating thermostats: The “feedback theory” says a thermostat is like a switch, turning on or off to maintain a temperature, while the “valve theory” holds that they must be adjusted more like taps, with heat “flowing” from them.
Although the feedback theory believers are closer to the technically correct explanation, that doesn’t stop valve theorists from reaping greater benefits from their thermostat use in some cases, including lower energy use.
“A theory that is useful for designing thermostats is not guaranteed to be a good theory for using them,” wrote the study’s author, Willett Kempton of Michigan State University.
The Facebook experiment involved first sorting the participants into two groups. One group, containing fewer than half of the participants, was aware that an algorithm decides which items appear in their news feed; the others were not.
Both groups used a tool the researchers created called FeedVis. It displayed their news feed alongside every post their friends had written, giving them a sense of which posts Facebook’s algorithm highlighted for them to see. It also showed them how frequently various friends’ posts appeared in their feeds.
When asked how they believed the algorithm worked, the users described a variety of theories:
The Personal Engagement Theory. “The more interactions that you have with somebody, the more their stuff will show up on your News Feed.” One user also described this tactic: “When I ‘like’ something, I usually hide it from my News Feed because I like it but I don’t necessarily want to know all about it all the time.”
The Global Popularity Theory. “The more people that click that they like [a story], the more people that comment, the more people get to see it.”
The Format Theory. “This is what I found: If you just do a written post, just words, it reaches more people. As soon as you put a video or photo attached, they cut down how many people are going to see it.”
The Narcissus Theory. “Maybe if we’re from the same group, like the rugby people, I see more from [them].”
The OC (original content) Theory. Content gets distributed more widely “when you upload your own photo versus just sharing another photo from another Facebook page.”
The Control Panel Theory. “I’ve hidden things like this before, like the daily horoscope and things like that, so maybe that’s why these types of things don’t show up on my feed.”
The Theory of Loud and Quiet Friends. “Time definitely has something to do with it. If they’re going to post every single post onto someone’s News Feed, someone could use that to their advantage and literally post the same letter over and over and over again to bury someone else’s message. Or they could possibly spam a message of […] like ‘free the penguins’ and just copy and paste, enter, copy and paste, enter, copy and paste, enter, and that’s going to absolutely fill up whoever is on that list.”
The Eye of Providence Theory. According to the researchers, participants who posited this theory thought that Facebook was “powerful, perceptive, and ultimately unknowable,” like an all-seeing eye. One user said that seeing friends’ “very liberal posts” meant that Facebook could perceive his own political leanings.
The Fresh Blood Theory. “He recently friended me so maybe that’s why [he’s in the ‘mostly shown’ column].”
The Randomness Theory. “I’m guessing it only sends out 20% of [a friend’s] posts, and maybe it just randomly selects which ones.”
In the final phase of the experiment, the participants were asked how they would gain control over their feeds, given their theories of the algorithm. Faced with this task, users elected to act on only two theories: the Personal Engagement Theory and the Control Panel Theory. Eight participants, or 20% of the group, could think of no way at all to modify their feeds to accomplish the tasks the researchers assigned.
That’s a troubling conclusion for users who want some agency over their feeds. One participant, asked how she would get Facebook to display a story written by her cousin in her feed, was visibly frustrated.
“If I understood how Facebook chose to do what they did, then I would know what I needed to do, but since I don’t know what that is, I don’t know,” she told the researchers. “If they told me what to do, I would do it!”