Holding Center: Facebook’s disinformation policy is not wrong
Let there be no doubt that the public spread of disinformation poses an existential threat to our democracy.
In 2016, a concerted campaign by Russian actors targeted swing voters to sway them toward voting for then-presidential nominee Donald Trump. Some of the ads tried to convince Democrats that they could vote by texting “Hillary” to an unlisted number. More recently, a video ad from the Trump campaign spread the unevidenced claim — or lie, if you prefer — that former Vice President Joe Biden offered Ukraine $1 billion to fire the prosecutor investigating his son Hunter Biden’s company.
Facebook’s refusal to remove the video, and its broader policy of not fact-checking political ads run on the platform, have drawn considerable criticism, with Sen. Elizabeth Warren calling the platform a “disinformation-for-profit machine.” Even Facebook employees have called for the company to amend its policy on political misinformation, but Mark Zuckerberg, in the supposed interest of free speech, has refused to change the company’s position.
It’s important to remember what’s at stake here: Democracy, which depends on a shared set of facts, has begun to flounder as we cordon ourselves off into separate echo chambers.
We cannot have fair elections if people vote based on false beliefs, such as the claim that Biden is guilty of the very sort of misdeed that Trump actually committed. The ongoing impeachment inquiry attests to the reality that Trump withheld aid from Ukraine to pressure the country into investigating Biden and his son, yet Trump and his campaign have popularized the idea that the opposite is true.
This muddying of the water prevents fact from prevailing over fiction in elections, the impeachment process and the public eye. The problem clearly needs to be addressed, but the best solution is not so clear.
Unlike Facebook, Google has decided to ban all political ads targeting voters based on their political preferences or voting records.
Though the intent of this policy is to target disinformation, the true impact will land on smaller, grassroots political campaigns that lack the funds for broad, untargeted advertising. The actual problem will remain largely unchecked, as disinformation campaigns are built mostly on the sharing of troll-generated content rather than on purchased political ads.
Twitter took a more heavy-handed approach, banning all political ads from the platform. CEO Jack Dorsey argued that “political message reach should be earned, not bought.” The idea is sound: Selling political reach undermines the democratic ideal of social media, in which every idea gets equal standing before a wide audience.
Despite its good intent, the choice is a cop-out.
Like Google’s new policy, the ban does not target the troll farms responsible for the majority of disinformation campaigns. Additionally, the policy covers all political advertising, including ads with “the primary goal of driving political, judicial, legislative, or regulatory outcomes.” This means that a climate change organization would be prevented from buying ads on Twitter, whereas an oil company would have free rein to promote false or misleading information. Educational ads on immigration and health care reform would be similarly barred from the platform.
Facebook’s hands-off approach to political lies is not popular, but at least it isn’t rash: It doesn’t harm down-ballot candidates or hamper positive political reform. The sweeping reforms put in place by Google and Twitter are attractive because they seek to address the growing problem of untruth in politics, but the real problem can hardly be solved by eliminating political ads altogether.
Ultimately, it is up to people to check facts and discern truth, not the platforms to broadly prevent speech — true or false — from reaching the public. After all, asking Facebook to fact-check every political ad that crosses its platform allows a private corporation to be the arbiter of truth for over 2 billion people. This is a tremendous and troubling responsibility, and it should not be left to any one institution.
This is an exciting moment in our history: The problem of political disinformation has been thrust into the center of our attention. We are weighing our values of truth and free speech against one another, and so our decision must be deliberate, not based on knee-jerk reasoning.
If Facebook wishes to be measured in its approach rather than limit speech in broad strokes, we might do well to respect its restraint.
Dillon Cranston is a sophomore writing about politics. His column, “Holding Center,” runs every other Wednesday.