
The Case Against Facebook’s Neutrality

How the social network's laissez-faire ideology plunged it into crisis


In response to withering criticism of its role in perpetuating fake news, Facebook has recently signaled to users, regulators, and politicians that it is finally getting its house in order. Over the last few weeks, the company has announced changes to its News Feed that prioritize content from friends and family; introduced features that will allow users to “rank” news outlets by trustworthiness; and acknowledged mistakes in its handling of fake news and the broader abuse of its platform.

Appearing in Munich on Monday, Facebook vice president Elliot Schrage admitted that the company needs to improve. But he also set clear limits on how far it would go in remedying its predicament. According to Axios, Schrage insisted that users, not experts, should be responsible for determining Facebook’s content: “Schrage said that Facebook itself shouldn’t be the one to decide which news to promote and said that, in a polarized world, turning things over to any third party simply ‘invites criticism [of] who that body of experts is.’”

In setting these hard limits, however, Schrage inadvertently points to two larger problems the social network is facing. The first is that the changes Facebook has made so far are largely cosmetic, which suggests the company is principally concerned with keeping regulators at bay rather than with ensuring that its more than two billion users aren’t hosed with lies and misinformation. The second is that Facebook’s problems—its singular ability to sow discord between people and spread fake news—are baked into its DNA.

In good times, Facebook has positioned itself as a gift to humanity, as during the Arab Spring, when protesters used the social network to organize. But for the most part it has sold itself as a neutral conduit for interactions between people and brands. Given its global ambitions, it had no other choice. Facebook was for everyone, everywhere: for liberals and conservatives, for people in democracies and for those under authoritarian rule. This neutrality was given a noble sheen—the social network as a kind of digital Switzerland.

But this ideology of neutrality is also a very useful public relations exercise, one that gave the company’s ferocious expansion a virtuous cover. The tension came to the fore when Facebook’s partnership with Filipino strongman Rodrigo Duterte, whose regime has killed thousands, was made public in December, two months after CEO Mark Zuckerberg proclaimed that he didn’t “want anyone to use our tools to undermine democracy.” Facebook, clearly, is a tool for despots and dissidents alike.

In response to accusations that Facebook was used by Russian agents to influence the 2016 election, and amid the growing sense that social media helped drive cataclysmic events like Brexit and the election of Donald Trump, the company at first mocked the notion that it could have had any role in influencing politics. “Personally, I think the idea that fake news on Facebook—of which it’s a small amount of content—influenced the election in any way is a pretty crazy idea,” Zuckerberg said two days after the election. But as political pressure has increased, Facebook has withdrawn into a defensive crouch, making minor but very public tweaks while reinforcing its ideology of neutrality at every opportunity.

Zuckerberg’s announcement two weeks ago that the company was pivoting away from news was part of this strategy. Facebook, Zuckerberg said, was getting back to its roots. Divisive news stories were being phased out, while pictures of your friends’ babies and dogs were being phased back in. As I argued last week, this shift was partly aimed at regulators, with Zuckerberg making it clear that this massive, unregulated, quasi-monopolistic media company was not some kind of threat to democracy as we know it. But the rebranding was also aimed at users. Facebook has gained a reputation as a 24/7 Thanksgiving table—a place for friends and family to come together and fight over politics. Facebook was acknowledging that a growing number of people find spending time on the social network to be, well, unpleasant.

But the changes don’t go far enough. Case in point: Schrage’s reluctance to assemble experts to monitor content on Facebook. In fact, putting together an independent panel to address the company’s growing disinformation problem would be a strong step. Twitter, which has taken its fake news problem much more seriously, has embarked on a similar strategy with promising results. Tellingly, Schrage is arguing that it wouldn’t work for Facebook because it would “invite criticism”—because it would, in other words, create a PR headache for the company.

It’s likely that Facebook is overlearning the lessons of its short, disastrous experiment in news curation. In 2016 the company fired its skeletal editorial staff—the people responsible for selecting Facebook’s highly coveted “trending topics”—after former staffers revealed that the team had suppressed news from conservative sites, a revelation that drew protests from conservatives alleging bias. Schrage is essentially saying that Facebook wants to avoid repeating that mistake.

Unfortunately, this laissez-faire attitude is exactly what got Facebook into this mess. The company has regularly blamed bad actors when its services are abused, while ignoring how its platform facilitates that abuse. Facebook is facing the first genuine crisis of its decade-plus existence, with government regulators preparing to pounce and users increasingly souring on the site. But the actions it has taken so far have not only been minor; they have exacerbated what made Facebook so problematic in the first place. The answer for Facebook isn’t to retreat into neutrality, but to move in the opposite direction.