Facebook's new policies on the U.S. election are so narrow that they stand no chance of affecting political discourse or news consumption across the site, leaving the same holes open for misinformation to spread.
The social media giant announced Thursday that it will ban new political ads the week before the Nov. 3 election and enforce other policies around mail-in voting and the Covid-19 pandemic.
The announcement wasn't just a poorly executed PR stunt from Facebook; it was a doubling down on the policies that have allowed a poisoning of the platform for years.
Despite Facebook's flashy announcement, the site is going to look the same as it has all year, with some very minor exceptions. It's much ado about nothing.
Let's break it down as simply as possible:
You'll still see political ads on Facebook, even during the week before the election. The political ad ban only affects new ads submitted after Oct. 27. Political ads submitted before then will still run, and advertisers will still be allowed to adjust the targeting on those ads so they reach the people they want to reach. Those ads can also contain lies or misinformation, under Facebook's existing policy. The only thing this will prevent is political ads about last-minute issues that arise in the final seven days of the campaign.
Also, there's nothing stopping political advertisers from front-loading their ad buys on Facebook before the ban goes into effect so they can still run right up to Election Day.
Facebook's changes go into effect just a week before the election, after millions will have already voted. Between early voting in states where it's available, mail-in voting and absentee voting, a record number of voters, up to 80 million, are expected to cast their ballots before Election Day, according to an analysis by The New York Times. Facebook's changes to those voters' media diets will come far too late for millions to make an informed decision.
Users, including political candidates, will still be able to spread false information about mail-in voting and the pandemic. The only content that's explicitly banned under the new policy is posts claiming you'll catch Covid-19 if you go out and vote. If you post that, Facebook will remove it. Other false or misleading content related to voting or the pandemic will merely be labeled with a link providing accurate information.
But you have to ask yourself: In such a polarized environment, are users more likely to believe false claims posted by their favorite candidate or a link to more information posted by Facebook? I think we all know the answer to that.
Political candidates will still be able to claim victory or cast doubt on the election results before the election has been called. Again, Facebook will add a link to those posts with accurate information. But candidates will still be allowed to declare victory before the race is called, or allege election fraud if they lose, even if there's no evidence to back up those claims. It's difficult to believe that Facebook's fact-checking labels will have any real impact on what people think.