"By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible. These voices will increase in number and volume. They cannot be ignored. Over time, we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few." -- Mark Zuckerberg, Facebook Founder and CEO, 2012.
Facebook, the world's most popular social media site, with 1.65 billion active users, has immense power. Now, according to an exposé in Gizmodo, it has been caught actively exercising that power. On Monday, Gizmodo charged that former Facebook workers routinely manipulated the site's trending news section to leave out conservative stories. They also injected stories that were not actually trending, and sidelined news about Facebook itself. These charges ought to deeply trouble us all, even if, as progressives, we might like the idea of a built-in bias against conservative news.
As a journalist, I rely heavily on my vast network of colleagues around the world to alert me to breaking stories. Increasingly I find news hooks for stories, and even sources, through Facebook. The website is a powerful tool in my arsenal. But what if that tool is being perverted? On the one hand, Facebook claims it is a social tool, created to help people share information directly with one another. On the other hand, it is also allegedly trying to act as a filter of information, much like traditional media outlets have done.
Early in its evolution, Facebook simply showed you a "news feed" of your friends' posts in the order they updated their statuses. Somewhere along the way, Facebook decided to "curate" that news feed. Slowly but surely, users began to see a skewed version of their friends' status updates, shaped by secret algorithms the company created to show them what it thinks they want to see. That took Facebook down an insidious path that led to Gizmodo's charges of manipulation of the "Trending" news section. After all, Facebook was already using proprietary algorithms to shape your feed. Now, it turns out, some employees may have been shaping the news to their personal liking.
In an interview on "Rising Up With Sonali," Robert Jensen, a journalism professor at the University of Texas at Austin, made an important point: "It's time we stopped calling these companies 'social media companies,' as if they are a kind of unique corporate entity." Rather, said Jensen, "It's time to start calling Facebook 'corporate media.' " Gizmodo's Michael Nunez, who broke the original story, made a similar point. "Facebook's news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation," he said.
Jensen added, "When Facebook and other tech companies claim to be neutral because they're running these algorithms and they've taken human judgment out of it, that's no more of a coherent claim than when The New York Times claims to be neutral."
It is difficult to convince ourselves that Facebook is on a par with traditional corporate media, simply because most of what we consume on the site is content that our friends generate. But in reality, Facebook's goal, like that of any profit-driven corporation, is to make as much money as possible. To that end, it is crucial that the company sell as many eyeballs to advertisers as possible, just like traditional corporate media. Whether the alleged manipulation of news feeds was geared toward that end is as yet unknown. Still, it is worth reminding ourselves that what we are exposed to when we use the website may be slanted.
We are familiar with the idea of television reporters and print journalists carrying their unique biases into their work, whether they admit to it or not. But who are their counterparts at Facebook? According to Nunez, they are "a small group of young journalists, primarily educated at Ivy League or private East Coast universities, who curate the 'trending' module on the upper right-hand corner of the site." These so-called curators are part of Facebook CEO Mark Zuckerberg's stated plan to make his company "the primary news experience people have."
Zuckerberg's utopian vision of Facebook--"a strong company with a strong economic engine and strong growth can be the best way to align many people to solve important problems"--is in line with the new generation of Silicon Valley technocrats who see democracy and freedom intimately wrapped up in the entitlement of wealthy elites to make as much money as possible. In other words, if you knock capitalism, you're knocking democracy and freedom, and if you want to promote democracy and freedom, you should nurture the desires of Zuckerberg to maximize profits, because of, well ... freedom and democracy.
Unfortunately for Facebook's wunderkind, a majority of young Americans are now skeptical of capitalism. After all, our current economic model has hardly benefited people under the age of 30, growing numbers of whom are more educated than ever yet more burdened by debt and less able to find decent-paying jobs. If we see Facebook as a central part of the selfish economy (rather than the "sharing economy," as techies like to call it), interested in users only as sources of income and manipulating the news we read, we might start to deeply question the outsize role that such companies play in our lives. We might start to see democracy and capitalism as antithetical.
In its response to the Gizmodo article, Facebook openly admitted its most insidious goal: "We will also keep looking into any questions about Trending Topics to ensure that people are matched with the stories that are predicted to be the most interesting to them, and to be sure that our methods are as neutral and effective as possible."
Think about the contradictions in that statement. The most neutral filter would be no filter at all, but Facebook is intent on showing us the news that it thinks we want to see. Somehow the algorithms the company creates are supposed to take human bias out of the selection process so that the stories we are shown appear to be "naturally" chosen for us. But if you get your news from a variety of sources, you are much more likely to be exposed to perspectives that might challenge you, or with which you might disagree. With increasing numbers of people relying on Facebook as their sole source of news, and Facebook in turn exposing them only to stories within their comfort zone, are we not creating a highly polarized world in which everyone is convinced that their perspective is the popular--maybe even the "correct"--one?
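To see why "we ran it through an algorithm" is not the same as neutrality, consider a minimal, invented sketch of how a "predicted interest" ranking might work. This is my own illustration in Python, not anything Facebook has published; every field name, weight, and cutoff in it is made up for the sake of argument.

```python
# Toy illustration only (not Facebook's actual code): rank stories by
# "predicted interest." The signals and weights below are invented; the
# key point is that someone has to choose them.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    source: str
    shares: int               # how widely the story has been shared
    friend_engagement: float  # 0-1: how much your friends interacted with it
    topic_match: float        # 0-1: overlap with topics you already click on

# Hypothetical weights: favoring topic_match keeps users in their comfort
# zone; favoring shares amplifies whatever is already viral.
WEIGHTS = {"shares": 0.2, "friend_engagement": 0.3, "topic_match": 0.5}

def predicted_interest(story: Story) -> float:
    """Score a story as a weighted sum of engagement signals."""
    normalized_shares = min(story.shares / 10_000, 1.0)
    return (WEIGHTS["shares"] * normalized_shares
            + WEIGHTS["friend_engagement"] * story.friend_engagement
            + WEIGHTS["topic_match"] * story.topic_match)

def curate(stories: list[Story], limit: int = 10) -> list[Story]:
    """Keep only the top-scoring stories; everything else never appears."""
    return sorted(stories, key=predicted_interest, reverse=True)[:limit]
```

The point of the toy example is that someone chooses the signals, the weights (or the training data for whatever model replaces them), and the cutoff below which a story simply never surfaces. That is editorial judgment, whether or not a human reads each headline.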
Of course, filtering is always going on. When I choose stories to cover on my progressive radio and television news show, I am filtering the news for my listeners and viewers. "The world is too complex for us to have direct access to all of the news," said Jensen. Rather than demanding "unfiltered information," he said, we should be asking, "Who's doing the filtering and on what principles?"
Facebook does not answer those questions (my listeners and viewers know off the bat that the news they get on my show is viewed through my progressive lens). Instead, it feeds us vague-sounding pabulum about making "the world more open and connected," while behind the scenes, nameless, highly educated young people may be using secret algorithms to tweak our news this way or that. Rather than making the world more open and connected, the company may well be fostering less openness and transparency and potentially greater political polarization.
In response to Gizmodo's charges about Facebook, the U.S. Senate Committee on Commerce, Science and Transportation has opened an inquiry into its practices. Of course, chances are that if Facebook had been charged with undermining progressive rather than conservative news, the GOP-dominated Senate would have ignored the story altogether. Still, as Jensen says, "If Facebook is doing something that it claims has a public benefit--they're not just selling shoes; they're selling us an experience that has to do with citizenship--there's a demand for transparency that we should be making."
Given that nearly a quarter of the global population actively uses Facebook, how might ordinary people wield our power to demand transparency? After all, Facebook is merely an empty shell without our online presence and interactions. Users have enormous potential leverage over the company, even more than we have over traditional media, given the interactive nature of the beast. If we start to think of Zuckerberg's creation as a crucial tool that truly fosters democracy, perhaps we should perceive it as an electronic commons and reduce the demands of a capitalist, money-making model to a secondary or even oppositional role.
As Zuckerberg himself presciently said, "These voices will increase in number and volume. They cannot be ignored." Even he knows that the power to make demands of Facebook is in our hands.