
Facebook Is Designed to Spread Covid Misinformation

The White House wants Facebook to crack down on anti-vax posts, but that would require the social media giant to transform on a fundamental level.


The Biden administration went to war with Facebook late last week, blaming it for the persistent and dangerous anti-vax sentiment gripping a significant chunk of the country. Surgeon General Vivek Murthy blamed “technology platforms” for spreading misinformation, while White House Press Secretary Jen Psaki called for more stringent social media policies, saying, “You shouldn’t be banned from one platform and not others for providing misinformation.” The president himself was even more direct, saying on Friday that Facebook is “killing people,” and rather than walking that back, the White House doubled down, telling Fox News, “They’ve been withholding information on what the rules are, what they have put in place to prevent dangerous misinformation from spreading, [and] how they measure whether it’s working.” (On Monday, President Joe Biden said that his comments shouldn’t be taken literally and were meant to inspire action against misinformation.)

These concerns about misinformation seem justified given the virulent delta variant and mounting vaccine hesitancy, but the right has greeted this rhetoric with alarm. In recent weeks conservatives—led by former President Donald Trump—have accused the administration of colluding with Facebook to censor posts and information deemed verboten. The reality is more complicated. Facebook and the White House were, according to The Wall Street Journal, communicating privately in recent months over the problem of medical misinformation (the White House also met with other social media platforms). But the two camps’ disagreements have hardened into a public rift that exposes some irreconcilable challenges of the present media environment. Facebook has become a petri dish, where some of the country’s entrenched problems—problems that touch health care, education, and trust in public institutions—are allowed to commingle and dangerously metastasize. For that, we have no one to blame but Facebook. But whether Facebook can overcome its existential—and perhaps terminal—contradictions is far less certain.

At its present scale, there may be no way to solve Facebook’s content moderation problems, or to mollify the conservative anger that treats any attempt at moderation as “censorship.” But that hardly absolves Facebook of its responsibility for creating a networked space for militant anti-vaccine campaigns, which leverage Facebook’s own advertising and recommendation systems to deliberately spread disinformation to millions of people. The problem isn’t so much that Facebook refuses to censor messages that harm public health; it’s that the platform was seemingly designed to funnel those messages to as many people as possible.

Facebook is not a level playing field for speech and counter-speech, especially on matters of confirmed fact, like the efficacy of vaccines. Instead, Facebook is an algorithmically driven, automated surveillance and advertising machine, an addictive, personalized propaganda apparatus and informational filter that, despite the occasional content warning, does more to reinforce false beliefs and confirmation biases than to challenge them. (“Disinformation” refers to deliberate falsehoods, “misinformation” to untrue things that the sharer doesn’t know are false; both are problems on Facebook.)

Conscious of these dynamics, a small number of people are getting rich and building political fan bases by using Facebook to spread disinformation that’s harmful to the public interest. According to one now widely cited study (which Psaki referred to), 12 people, ranging from Robert F. Kennedy Jr. to a handful of formerly obscure osteopaths, are responsible for about 65 percent of Covid-related misinformation on Facebook. They are similarly prolific on Twitter and Instagram.

On Friday, the most popular vaccine-related Facebook post in the United States was an anti-vax message from far-right Representative Marjorie Taylor Greene. A tech watchdog group found that 11 of the 15 top-performing vaccine posts were “negative or anti-vax.” The most popular Covid post on Facebook last month, with four million views, was apparently a video filled with dark innuendo and false information from conservative provocateur Candace Owens. On Facebook, false claims about vaccine-related deaths and secret tracking chips have doubled in recent months, according to a consultancy cited by the Journal.

The connection between social media campaigns and “vaccine hesitancy,” as some scholars call it, was well documented before Covid-19. Now an already existing anti-vax movement is fusing with right-wing resentment, QAnon-style mysticism, and lockdown discontent, all of it juiced by a media environment rife with misinformation and amplification of unproven claims. “Facebook has a Covid disinformation problem because Facebook has a right-wing extremism problem,” said the Real Facebook Oversight Board, an advocacy organization critical of the social media giant, in an emailed statement. “By remaining the platform of choice for insurrectionists, extremists, and far-right radicals, Facebook continues to wallow in filth—from Qanon to January 6 to Covid disinformation.”

Facebook, for its part, moved quickly to rebut such accusations in public. In a post titled “Moving Past the Finger Pointing,” Guy Rosen, Facebook’s VP of integrity, denied the White House’s unproven “allegations” and touted Facebook’s role in increasing vaccine acceptance among its users. “The data shows that 85% of Facebook users in the US have been or want to be vaccinated against COVID-19,” wrote Rosen. “President Biden’s goal was for 70% of Americans to be vaccinated by July 4. Facebook is not the reason this goal was missed.”

Amid competing interests and complicated ethical questions, the stakes are high: Who’s responsible for slowing Covid-19 vaccine uptake and fueling what’s been called a new pandemic of the unvaccinated? What can be done about rampant misinformation on social media platforms, particularly Facebook? How do platforms manage misinformation without becoming the would-be censors conservatives accuse them of already being? And how do we reconcile traditional notions of free speech with opaque private monopolies like Facebook? The answers are not always self-evident, nor are they easy to litigate on cable news.

The potentially deadly struggle over vaccine misinformation can seem Manichean, with Facebook’s unrelenting critics accusing the company of being complicit in mass death, or even of profiting from it. There is certainly little reason to sympathize with a surveillance-capitalist monopoly like Facebook or its leadership, and Biden’s tetchy, reflexive opposition to whatever the company does is likely to be popular with the public, not to mention genuine. Whereas Donald Trump enjoyed, until January 6, a mostly quiet and harmonious relationship with Mark Zuckerberg, Biden has been beating this drum for a while. “I’ve never been a big Zuckerberg fan,” Biden told The New York Times in January 2020. “I think he’s a real problem.”

The Facebook CEO, of course, isn’t the one who has made vaccine hesitancy an acceptable position in the culture wars. If Trump had promoted vaccine uptake from the beginning, rather than just bragging about his role in Operation Warp Speed, we might be facing a far different situation.

Still, it’s not unreasonable to be leery of the Biden administration’s newly aggressive posture toward Facebook; we should heed the alarm bells of authoritarianism and public-private collusion it sets off. However desperate the situation is, we should be careful of unintended consequences when a tech giant, pressured and empowered by an impatient federal government, faces calls to crack down on what people are saying online. The term “censorship” still feels insufficiently nuanced or descriptive for what’s happening here, though. The core problem remains that Facebook, by design, funnels people into groups and pages containing the kind of false information that’s been documented to have a material effect on public health. The company can do something about that; it may be the only actor here that can make a difference. Whether Facebook can handle the fallout, or whether our political discourse can survive the consequences, is still up in the air.