
Facebook Whistleblower Reveals How They’re Dropping The Ball On Election Disinfo

Facebook is not fulfilling its claims to do everything it can to combat disinformation and prevent social harm through its platform.

This fall, Frances Haugen changed the narrative about Facebook with her whistleblower testimony and release of internal documents to the Wall Street Journal. Her revelations were many, and included the fact that Facebook is still failing to deal with disinformation on the platform. This includes disinformation about voting and elections, which continues to spread. 

From Haugen’s actions and testimony, we’ve learned even more about Facebook’s inability and unwillingness to deal with disinformation on the platform, even including threats of violence and incitement to violence — which are acted on by the platform less than 1% of the time. One example of Facebook’s awareness of problems with its algorithm is an internal experimental report titled “Carol’s Journey to QAnon: A Test User Study of Misinfo and Polarization Risks Encountered Through Recommendation Systems,” which found that if a brand-new account follows a few high-profile verified conservative pages, it receives “polarizing” page recommendations within a day. These pages are some of the biggest culprits in spreading election disinformation, and it doesn’t stop there. Within two days, the account receives conspiracy recommendations. Within a week, the account receives QAnon page recommendations, which are also rife with election disinformation.

We also learned from the complaints that many political figures are “whitelisted” and not acted on, or “actioned,” through suspensions or bans, even when they repeatedly violate policies, including civic-integrity policies. Existing civic-integrity controls were on for the election, but, as Haugen said in her 60 Minutes interview, “…as soon as the election was over, they turned them back off or they changed the settings back to what they were before to prioritize growth over safety.” More revelations will likely follow, but the lesson is clear — Facebook is unwilling to make changes that benefit the public good but might result in lowered revenue or PR issues. This blocks the structural changes that would reduce Facebook’s harmful impact on democracy. 

This new information shows that Facebook is not fulfilling its claims to do everything it can to combat disinformation and prevent social harm through its platform. There is much more that can be done to prevent disinformation. This is especially necessary because Facebook is backing off on its enforcement of election disinformation. As Jesse Littlewood, Vice President of Campaigns at Common Cause, wrote in USA Today, our white paper found that “85% of the most shared Facebook links on voting and more than half of individual links on that platform with the most engagement came from right-wing outlets that frequently spread disinformation on voting and elections.” What’s more, many of the currently active posts we surveyed were not actioned, even though they once would have been.

There is no off-year for election disinformation. The Big Lie persists, and a full 36% of Americans falsely believe that the election was stolen from Donald Trump. With around 48% of people getting their news from social media, these platforms are critical vectors for the spread of disinformation. The false claim of election fraud continues to affect our democracy — election officials are still being harassed and threatened, voters are being intimidated by groups going door-to-door to ask who they voted for, and state legislatures have passed 33 laws in 19 states to restrict the vote. Facebook needs to act now to prevent election disinformation from severely impacting the 2022 midterms — and beyond. 
