By Adrienne LaFrance
One of the vexing things about the way algorithms rule the Internet is that it’s possible to sense they’re there without knowing exactly what they’re doing. The outcome of algorithmic sorting is only partly clear—you know what you see, but you don’t know what you don’t see. And the logic of that information filtering is hidden.
What if, instead of relying on platforms like Facebook to invisibly turn the digital knobs that make each person’s News Feed look different, Facebook users could subscribe to different algorithms—formulas designed by humans who could explain their approach?
Publishing has long been about filtering information and bundling content, but historically it was human editors choosing story packages and arranging them on front pages or home pages. The specific stories were the body to the publication’s spirit. Subscribing to an algorithm would be subscribing to the sensibility of a publication, or as much of it as could be transmitted by algorithmic filtering.
“You just have a couple players that are generating massive amounts of content at scale—or in the case of Facebook, controlling massive amounts of content,” said Nicholas Diakopoulos, an assistant professor at the University of Maryland who studies algorithmic accountability. “But if we have a more plug-and-play approach, maybe we can allow smaller algorithm-purveyors to flourish.”
The idea is to combine the relative transparency of human-made editorial decisions with the precision of algorithmic ones. Imagine, Diakopoulos says, if anyone could write her own News Feed algorithm—and if those who didn’t want to write their own could subscribe to the ones others created. This sort of architecture would enable people to choose at least some of the sensibilities driving what they see on social platforms, rather than being totally at the mercy of an algorithm that Facebook engineers tweak for all kinds of reasons—including those known and unknown to users.
“What that then enables is a marketplace to develop where different people can have their own algorithm running on Facebook,” he says. “Then I can say, ‘Hey, I want The Atlantic’s News Feed algorithm, or The New Yorker’s News Feed algorithm, or I want Bob So-and-So’s News Feed algorithm. All of a sudden it becomes a subscription model where I have a choice.”
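The plug-and-play marketplace Diakopoulos describes can be sketched in a few lines of code: publishers register named ranking functions, and each user’s feed is built by whichever one they subscribe to. This is a minimal illustration, not anything Facebook offers; all names here are hypothetical.

```python
from typing import Callable, Dict, List

# A "story" is just a dict of attributes; a ranking algorithm is any
# function that takes the candidate stories and returns them reordered.
Story = Dict[str, object]
RankingAlgorithm = Callable[[List[Story]], List[Story]]

# Hypothetical marketplace: a registry of publisher-supplied algorithms.
ALGORITHM_REGISTRY: Dict[str, RankingAlgorithm] = {}

def register(name: str, algo: RankingAlgorithm) -> None:
    """Add a publisher's ranking algorithm to the marketplace."""
    ALGORITHM_REGISTRY[name] = algo

def build_feed(subscription: str, candidates: List[Story]) -> List[Story]:
    """Rank a user's candidate stories with their chosen algorithm."""
    return ALGORITHM_REGISTRY[subscription](candidates)

# Two toy "sensibilities": one favors recency, one favors depth.
register("chronological", lambda s: sorted(s, key=lambda x: x["age_hours"]))
register("longform-first", lambda s: sorted(s, key=lambda x: -x["word_count"]))

stories = [
    {"title": "Breaking update", "age_hours": 1, "word_count": 300},
    {"title": "Deep investigation", "age_hours": 30, "word_count": 8000},
]
# Same candidate stories, two different subscriptions, two different feeds.
print([s["title"] for s in build_feed("chronological", stories)])
print([s["title"] for s in build_feed("longform-first", stories)])
```

The point of the sketch is that the filtering logic lives behind a name a reader can choose—and could, in principle, inspect—rather than in a single hidden formula applied to everyone.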
Just the act of choosing is a way for people to better understand why they see the information they see. For now, it’s hard to tell how much of what a person sees on Facebook represents what’s actually being published and how much reflects how that person tends to interact with information on Facebook and elsewhere online. The classic example: Liking baby photos on Facebook probably means Facebook’s algorithm serves up more baby photos.
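The baby-photo feedback loop is easy to see in miniature. Here is a hedged sketch of engagement-based ranking—not Facebook’s actual formula, whose details are private—where topics a user has liked before float to the top of the feed:

```python
from collections import Counter
from typing import Dict, List

def engagement_ranked(stories: List[Dict], liked_topics: List[str]) -> List[Dict]:
    """Rank stories so topics the user has engaged with rank highest."""
    likes = Counter(liked_topics)  # e.g. {"babies": 3, "politics": 1}
    # Score each story by how often the user has liked its topic;
    # unseen topics score zero and sink to the bottom.
    return sorted(stories, key=lambda s: likes[s["topic"]], reverse=True)

feed = [
    {"title": "Election analysis", "topic": "politics"},
    {"title": "Another baby photo", "topic": "babies"},
]
history = ["babies", "babies", "babies", "politics"]
print([s["title"] for s in engagement_ranked(feed, history)])
```

Each like feeds back into the next ranking, so the loop compounds: the more baby photos you like, the more baby photos you’re served, and the less of everything else you see.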
“Algorithms make editorial choices, right? Editors at The Atlantic or elsewhere have been making editorial choices for a long, long time,” Diakopoulos said. “Now we’re asking those algorithms to start making editorial choices. And I think the challenge is—in the past, the editorial choices of the human editor were visible and inspectable. There was a trace.”
That trace, for example, could be visible in a publication’s past coverage. If one newspaper repeatedly uses the term “undocumented immigrants” and another always writes “illegal aliens,” readers can infer some of the thinking and values behind the way information has been filtered. In already-established media formats like print, there are well-understood constraints that signal to readers why they’re seeing, say, what’s on the front page of a newspaper—the timing of the event being reported, the physical space available on the page, the newsworthiness of the story for a large and potentially diverse audience of readers.
Understanding why a Facebook News Feed looks the way it does is more complicated. Every individual has a unique algorithm-driven experience that’s determined not just by the publishing choices of the 1 billion users who make the site’s content, but also by integrated marketing material, by a user’s web behavior outside of Facebook, and so on. And though it’s hard to imagine Facebook wanting to relinquish its algorithmic power, it isn’t totally out of the question.
“What they really want to control is the ability to show you ads,” Diakopoulos told me. “You’d have to convince them, but as long as there was some possibility to integrate advertising into your algorithms platform … That could actually give them lots of different kinds of audiences. There are certainly benefits to control. And there would be very real engineering challenges. But, you know, Apple did it. They created a marketplace of apps.”