Report

Under the Microscope

Election Disinformation in 2022 and What We Learned for 2024
BY Emma Steiner

About Common Cause Education Fund
The Common Cause Education Fund is the research and public education affiliate of Common Cause, founded by John Gardner in 1970. We work to create open, honest, and accountable government that serves the public interest; promote equal rights, opportunity, and representation for all; and empower all people to make their voices heard in the political process.

Acknowledgments
This report was authored by Emma Steiner with key assistance and editorial guidance from Jesse Littlewood, and published by Common Cause Education Fund. Common Cause Education Fund’s disinformation work has been supported by the FJC - A Foundation of Philanthropic Funds, the Minami Tamaki Yamauchi Kwok & Lee Foundation, the Trusted Elections Fund, the Marks Family Foundation, and the Leadership Conference on Civil and Human Rights. We are incredibly grateful for their support.

We thank Allegra Chapman, Marilyn Carpinteyro, Sylvia Albert, Kathay Feng, Susannah Goodman, Liz Iacobucci, Sam Vorhees, Stephen Spaulding, and Aaron Scherb for guidance and editing; Melissa Brown Levine for copy-editing; Kerstin Vogdes Diehn for design; Jack Mumby and Ashlee Keown for digital support; and Katie Scally and David Vance for strategic communications support.

Special thanks are due to the volunteers of Common Cause Education Fund’s Social Media Monitoring Program, led by Raelyn Roberson, and the volunteers of the Algorithmic Transparency Institute’s Civic Listening Corps, led by John Schmidt, who spent thousands of hours monitoring social media for disinformation that could disenfranchise voters. Many of the examples in this report came from volunteers in these programs.

Copyright © July 2023 Common Cause Education Fund.


Executive Summary

Election disinformation looms large, with the 2023 elections underway and the high-profile 2024 races already unfolding. As we enter this new threat environment, we look back at what we learned about election disinformation and its continuing evolution from previous election cycles through to today.

In our 2021 report on election disinformation, As a Matter of Fact, we illustrated how election disinformation causes digital and social harms and included recommendations for lawmakers and platforms in the lead-up to the 2022 elections.

The 2022 elections presented a wide array of new challenges—and a new starting point for election disinformation. Now, a significant portion of the electorate and the people running to represent them believe—and find it convenient to say—that our elections are illegitimate. We’re reporting back on the challenges voters faced, our efforts to disrupt disinformation in 2022, and what challenges lie ahead for the 2024 elections.

First, this report details our key findings from the 2022 election cycle and provides definitions for key terms. The report is then broken into sections detailing the lead-up and state of play heading into 2024, findings from our nonpartisan Election Protection coalition’s successes and lessons learned, and what lies ahead, as well as recommendations for legislative proposals to protect voters.

Section One, “The Lead-Up to 2022,” covers the role of profit in election denial, the outsized influence of social media platforms, threats of political violence, and information gaps.

Section Two, “Common Cause Education Fund’s Work in 2022,” covers our work identifying, flagging, and removing election disinformation, inoculating voters against disinformation, the role of partnerships in our Election Protection work, our work with media, and two case studies demonstrating successful interventions.

Section Three, “Looking Ahead,” covers the role of election disinformation in candidacies, how tech platforms are backing down from enforcing their policies against disinformation, the remaining threat of political violence, how election disinformation fuels voter suppression through attacks on the voting process, and legislative solutions.

Key Findings/Points

The year 2022 was a more challenging environment for voters than 2020, despite growing familiarity with new tools and methods of voting in a pandemic.

  • Right-wing partisan influencers, politicians, and activists alike used the popularity of election denial to build audiences and profit off of lies about voting and elections.
  • Due to profit incentives and competition, the tech industry further relaxed already inadequate standards for content moderation around election disinformation and showed no interest in changing algorithmic design to tackle the issue.
  • The 2022 elections were the first federal elections since the January 6th insurrection, and, as such, advocates feared a resurgence of political violence and domestic violent extremism—which continue to be a threat today.
  • The potential for vulnerable populations who exist within information voids and news deserts to be targeted by disinformation was greater than ever.

Nevertheless, the nonpartisan Election Protection community scored some key successes as a coalition to protect voters from voter suppression.

  • Common Cause Education Fund’s Social Media Monitoring program discovered emerging disinformation narratives about elections and voting and pushed real-time intelligence about disinformation to the Election Protection community.
  • We worked as a coalition to promote positive, pro-voter inoculation content about the importance of election workers, counting every vote, and other hot-button voting and election issues.
  • As a coalition, we implemented lessons learned from 2020 to build on messaging and make sure it reached more audiences and touched on a wider range of subjects than before, keeping in mind the populations specifically targeted by disinformation.
  • We helped educate the media on how to accurately and responsibly report on election disinformation and saw noted changes in how information was conveyed to voters.

Despite these successes, we know that the road ahead won’t be easy. The 2024 election cycle will present unique challenges in addition to the ones that are now standard in elections.

  • Election disinformation is now essentially obligatory for nearly all Republican primary candidates and opportunists seeking financial benefit.
  • It will be even more difficult to rely on tech platforms acting responsibly and voluntarily enforcing their policies.
  • There is remaining potential for political violence to flare up in the wake of potential indictments of a presidential candidate and a primary focused on rehashing lies about 2020.
  • More legislative norms are being eaten away by lawmakers seeking to cultivate support from a base steeped in conspiracy theories—who introduce legislation premised on lies about elections.
  • Any institution, tool, or practice, even ones with bipartisan support and buy-in, can become a target of disinformation.
  • New legislation points a way forward for grappling with both emerging and existing threats to voters.

 


DISINFORMATION is false rhetoric used to mislead. In elections, it’s used to dampen turnout among some voters, mobilize others based on lies, or call into question the results if an opponent wins in an attempt to either overturn the election or profit off of the chaos. Disinformation can alter voter participation, potentially causing voters to miss their opportunity to vote if they are confused about the voting process (the time, place, and manner of the election) or choose to stay home (“self-suppress”) due to worries about intimidation, violence or other consequences. Election disinformation also alters public perceptions about elections and their security, thereby impacting legislation and democratic norms in the long run.

ELECTION DENIAL: New research by the Massachusetts Institute of Technology delves into the roots of election denialism, and finds that it is largely motivated by racial resentment: “Among Republicans, conspiracism has a potent effect on embracing election denialism, followed by racial resentment. Among independents, the strongest influences on denialism are Christian nationalism and racial resentment. And, although election denialism is rare among Democrats, what variation does exist is mostly explained by levels of racial resentment.”

Election denial is motivated by the belief that others’ votes are lesser and shouldn’t count, and that the only way forward is to overturn undesired electoral outcomes. This belief is racist at its very core, and the forms it takes target members of marginalized populations. The fact that election denial is rooted in racial resentment regardless of partisan affiliation is a reminder that any attempt to combat disinformation must acknowledge and uplift those most affected by it.

Section One: The Lead-up to 2022

The Role of Profit in Election Denial

Right-wing partisan influencers, politicians, and activists alike used the popularity of election denial to build audiences and profit off of lies about voting and elections.

A small group of mostly right-wing personalities is responsible for “super-spreading” voter fraud myths, spawning millions of online interactions around false and misleading stories. For example, in the four-week period from mid-October to mid-November of 2020, then-President Donald Trump and the “top 25 superspreaders of voter fraud misinformation” accounted for 28.6 percent of the interactions people had with that content.

This trend holds true with podcasts too. A new analysis from the Brookings Institution shows that political podcasts are a consistent vector of disinformation. In their review of 79 different political podcasts, Brookings analysts found that “10 prominent podcasters were responsible…for more than 60% of all the dataset’s unsubstantiated and false claims.” Whether it’s COVID-19 vaccine disinformation or election disinformation, time and time again, research has shown that the bulk of engagement on disinformation is driven by a few superspreaders across social media platforms and other forms of media.


Many social media influencers who spread disinformation start with election denial and use that to build an audience before moving on to spreading disinformation about other issues, such as COVID-19 denial and climate change denial. They are then able to use the audience they have cultivated to promote more and more false claims. Meanwhile, a cottage industry of election denying influencers, activists, and fake analysts has benefited from online election disinformation.

There are multiple revenue streams an election denier can tap into with election disinformation. Not only is it lucrative to sell ads on channels that promote popular conspiracy theories, but there are endless opportunities to fundraise off of an election loss. For example, failed Arizona gubernatorial candidate Kari Lake raised $2.5 million after her loss, pledging to fight her defeat in the courts and overturn the results of the election. Trump himself fundraised off of election denial, raising over $250 million for recounts, legal fights, and promises to “Stop the Steal,” most of which was spent on unrelated expenses. Issue One recently found an entire network of political consultants and companies who profited from supporting the campaigns of secretary of state candidates who promoted election denial.

Other vote suppressor groups like “True the Vote” fundraise both for themselves and for sham election reviews—more than $9 million raised from Trump backers went to contractors who themselves profited from election denial. Some figures continue to tour the country giving talks as “experts,” continually raising funds to keep their sideshow going. In what Accountable.US describes as the “election denial industrial complex,” just a small group of attorneys, politicians, influencers, and vote suppressors are able to raise vast sums of money for their forays into election denial—and the opportunities to grift just keep coming.

Social Media’s Outsized Role in Election Disinformation

Because of profit incentives and competition, the tech industry further relaxed already inadequate standards for content moderation around election disinformation and showed no interest in changing algorithmic design to tackle the issue.

Major social media platforms such as Twitter, Facebook, TikTok, and YouTube play an integral role in spreading information about our democracy. More than 70% of U.S. residents use social media, and half of the adults in the United States “often” or “sometimes” get their news from social media. Social media has provided platforms for diverse voices and viewpoints, allowing users to find information from voices they trust. However, while social media has no doubt provided access to news and information and given a platform to many voices, research shows that it has a darker side: it heightens polarization in this country, fuels white nationalism and racism by providing a space to organize and radicalize, and contributes to racialized disinformation and organized hate against marginalized groups. The internet affects who is targeted by election disinformation and who has access to reliable information online.

Existing moderation standards on social media networks have gaps that disinformers exploited and continue to exploit to profit from false claims. It doesn’t help that negative incentives pressure social media platforms to decline to take action—after all, vitriolic engagement is still engagement, and it brings clicks and eyeballs to sites that are in fierce competition against each other. There’s political, social, and financial pressure not to remove bad actors for fear of impacting revenue streams and inciting backlash.

The January 6th Select Committee also found in its unreleased report on social media that “platforms’ lax enforcement against violent rhetoric, hate speech and the big lie stemmed from longstanding fear of scrutiny from elected officials and government regulators.” This situation persisted past 2020—in Common Cause Education Fund’s 2021 report Trending in the Wrong Direction, we found that in many cases, platforms quietly backed down on their existing civic integrity policies, in contrast to the public announcements they made when instituting them. While in the immediate aftermath of the 2020 election, social media platforms acted swiftly to remove inciting content, within months, these policies were relaxed—and posts that would have been actioned were allowed to gain massive engagement once more. The problem was exacerbated in languages other than English, with ever fewer resources dedicated to, for example, Spanish-language moderation and fact checks.

As shown by Common Cause Education Fund’s 2021 report and our 2021 white paper, moderation at major tech platforms has been inadequate at best and backsliding at worst. Civil society groups thought that if we could point out places where moderation wasn’t happening, social media companies would engage with us, fix it, and learn to prevent gaps in enforcement in the future. But during the 2022 election cycle, waves of tech layoffs made it difficult for civil society advocates to even know where to reach out—and made it harder for platforms themselves to conduct the basic functions of moderation. To account for this, we raised our concerns more publicly.

For the 2022 midterms, given platforms’ backsliding and impenetrable moderation standards, Common Cause led 130 public interest organizations to draft and submit a letter to leading social media platforms, advising them to monitor and reduce mis- and disinformation by implementing the following: “auditing algorithms that look for disinformation, downranking known falsehoods, creating full time civic integrity teams, ensuring policies are applied retroactively—i.e., to content posted before the rule was instituted—moderating live content, sharing data with researchers and creating transparency reports on enforcement’s effectiveness.” We know what platforms need to do to reduce the spread of disinformation; they have simply refused, and continue to refuse, to make better choices for user safety.

 

Threats of Political Violence

The 2022 elections were the first federal elections since the January 6th insurrection, and, as such, advocates feared a resurgence of political violence and domestic violent extremism—which continue to be a threat today.

Going into 2022, we had reason to be nervous about the potential for political violence. It was unknown how vote suppressors and election deniers would react to electoral outcomes, and there were new trends of targeting voters and continued targeting of election workers.

One telling example from 2020: after the Trump campaign took video footage of Fulton County, Georgia, election workers Ruby Freeman and Shaye Moss out of context to claim that they were engaged in fraud, the two women were targeted by mass harassment and threats. This was egged on not only by Trump’s lawyer Rudy Giuliani but also by Trump himself in widely viewed social media posts alleging crimes. Trump even mentioned Freeman over a dozen times in his infamous call asking Georgia officials to overturn the election. This campaign of harassment led to death threats and visits to the women’s homes, and resulted in Moss and Freeman having to flee their residences. Years later, Trump amplified attacks on Freeman after the release of her testimony to the January 6th Select Committee, asking “What will the Great State of Georgia do with the Ruby Freeman MESS?”

Voters also feared intimidation at polling locations: A poll from fall 2022 showed that “35% of Black Americans believe violence is likely or very likely at their polling place in November.” Reuters reported that 40% of voters were concerned about intimidation at the polls. One particular trend of concern was drop box surveillance, organized by election deniers on Truth Social. Election deniers with militia ties announced their intent to stand guard at ballot drop boxes and ceased action in Maricopa County only in response to a court order.


Information Gaps and Vulnerable Voters

The potential for vulnerable populations who exist within information voids and news deserts to be targeted by disinformation was greater than ever.

As we entered the 2022 midterms, researchers like those at the Brennan Center for Justice at NYU Law warned that information gaps—“when there is high demand for information about a topic, but the supply of accurate and reliable information is inadequate to meet that demand”—would present an issue for voters. This was further exacerbated by the fact that disinformers were relying on disinformation from 2020 to set a foundation for disinformation in 2022. The Brennan Center cited declining voter trust in elections and lack of public outreach about changes to voting procedures.

Heading into the midterms, only 47% of Americans polled had a “great deal” of confidence that 2022 votes would be counted properly, and Election Protection advocates had to thread a difficult needle of reassuring voters who had been exposed to election disinformation while also encouraging turnout.

Meanwhile, nonincumbent political candidates were adding to the problem. A study last year from NYU’s Center for Social Media and Politics found that “politicians in the 2022 election are sharing more links to unreliable news sources than they did in 2020, and the increase appears to be driven by nonincumbent Republican candidates.” The partisan difference in usage of unreliable sources was staggering: “36 percent of news that Republican candidates shared came from unreliable sites, while that was true for only 2 percent of news shared by Democratic candidates each day.”

Another study found that YouTube’s algorithm shows more election fraud content to accounts already “skeptical” of elections, creating a feedback loop of conspiracy content. As the 2022 threat framework from the Election Integrity Partnership detailed, social media disinformation was particularly suited for viral spread because of factors like the potential for massive engagement. This meant that people in news deserts, people targeted by disinformation, and people who rely on social media for news would potentially be more exposed to disinformation about the election.

 

Section Two: Common Cause Education Fund’s Work in 2022

Common Cause Education Fund’s Social Media Monitoring program discovered emerging disinformation narratives about elections and voting and pushed real-time intelligence about disinformation to the Election Protection community.

As one of the co-leads of the Election Protection coalition, Common Cause Education Fund took a leadership role in developing the strategy to combat election disinformation in 2022. We were uniquely positioned to reduce the impact and spread of disinformation through our Stopping Cyber Suppression program and through engagement with our national and state partners. Our interventions on social media protected voters and helped train grassroots volunteers to defend themselves and their communities from disinformation.

Identifying, Flagging, and Removing Election Disinformation

Over the 2022 primaries and general November election, we recruited and trained 2,202 monitors who in total submitted 3,825 items of potential social media disinformation for review to our team. On Election Day itself, 156 Common Cause volunteers (plus an additional 44 youth volunteers) gathered over 750 items of potential social media disinformation. Hundreds of additional volunteer monitors, whom we helped train, worked with our state and local partners on the ground. We were additionally involved in the Georgia Senate runoff with 33 social media monitoring volunteers.

When we receive potential disinformation, we triage it based on importance and impact. For content that might violate social media platform policies, we immediately report and request removal of the content. We also review content to determine if it is a growing trend or narrative of disinformation, looking at the content gathered by our volunteers and by partners. We then create pro-voter talking points and social media posts that push back on the disinformation narratives and circulate those to our national and state partners.

Removing disinformation from social media platforms was challenging and is only becoming more so. However, we were able to get over 300 social media posts removed across Facebook and Twitter in the 2022 election cycle. Some of these posts were threatening in nature, not only targeting specific individuals but creating a climate of fear around voter participation.

CASE STUDY: Stopping Threatening “Ballot Mule” Disinformation

In early 2022, a film named 2,000 Mules by known disinformation spreader Dinesh D’Souza was released. The film was created in collaboration with voter suppression and disinformation group True the Vote and falsely alleges mass “ballot trafficking” by so-called ballot mules and makes explicit false claims about election and voting procedures. We were on alert that these false claims could become a viral disinformation narrative on Facebook, Twitter, and other platforms given the high-profile nature of the movie production team involved and the continued salience of election conspiracy theories as we headed into the 2022 midterms. And given the increase in intimidation and threats to elections officials and voter protection groups created by election disinformation, we knew that this movie would turn up the temperature and increase the conditions that lead to threats, intimidation, and political violence.

Common Cause reached out to our contacts at Meta (Facebook/Instagram) before the movie was released with an early warning about the issues it would likely create, and again after the release we highlighted to social media companies the viral nature of this disinformation narrative along with fact-checks from PolitiFact and the Associated Press.

Our monitoring program soon identified trending viral content on Facebook and Twitter related to this “ballot mule” narrative. It was being posted by multiple users across multiple social media platforms, either inspired by or coordinated with the film. Much of the content was threatening in nature—making accusations about individuals or groups that they were involved in “ballot trafficking” as a “ballot mule”—with no evidence (and none has since been presented).

Our monitoring did identify some “ballot mule” content that was manipulated media—posts that falsely invoked law enforcement action targeting an individual involved in elections. We took immediate action, contacting Meta and Twitter about posts on their platforms that contained this manipulated media that directly targeted an individual—posts that had racked up tens of thousands of likes, comments, and shares despite the threat.

While Twitter took action over the next few days to remove these and similar posts, it took outside pressure—two media stories about this issue—before Meta finally acknowledged that this content violated its policies. Even after that point, we could find examples live on the platform—and only when we reported them were they removed. In total, over 200 pieces of this kind of threatening content were removed from Facebook and Twitter.

In the end, our advocacy resulted in Twitter and Facebook both enforcing and updating their policies to prohibit this kind of intimidation content on their platforms, a major success that will keep this content off those platforms. That said, as our experience shows, we have to remain vigilant and continue to monitor to ensure appropriate action is taken.

Inoculating Voters Against Disinformation

We worked as a coalition to promote positive, pro-voter inoculation content about the importance of election workers, counting every vote, and other hot-button voting and election issues.

A second key component of our work to combat election disinformation is to stop disinformation from taking root in the first place—to “inoculate” audiences against potential disinformation. Numerous studies have shown that when individuals are provided accurate information about a topic from a trusted messenger, it reduces the impact of disinformation. While this is broadly true across different issue areas, in voting and elections, it is especially critical, as voters can miss their opportunity to participate if they fall for disinformation or choose to “self-suppress” based on false narratives.

Stopping election disinformation is imperative to achieving true multiracial democracy with equal participation. Voters most at risk from election disinformation are new voters and infrequent voters who don’t have as much experience navigating our elections system, voters with limited English proficiency (as the bulk of voting information is in English), and students and other transient populations who are not as likely to be engaged by the parties. Often it is voters of color (especially those in immigrant communities) and young voters who don’t have the information needed or experience with voting, which compounds the impact of election disinformation. Black and Latinx Americans are three times more likely than white Americans to be told they lack correct voting identification, to be unable to locate a polling place, or to miss a registration deadline. And more than half of voters under the age of 35 (who are more diverse than voters over 35) do not have the resources or knowledge they need to vote by mail and are therefore more susceptible to mis- and disinformation.

Common Cause worked with the Leadership Conference on Civil and Human Rights to create content that combines voter information and messaging with “prebunking” to stop disinformation before it takes root in a community. These messaging guides and example content were created and distributed to the entire Election Protection network.

One key element is localized content for specific states. Similarly, we coordinated the translation of inoculation content into the languages of communities that are targeted by disinformation. The state-level organizations that are trusted messengers in their communities must have the resources, strategy, and capacity to effectively inoculate their communities against disinformation.

Election officials are important sources of trusted information, and the National Association of Secretaries of State has a public education campaign designed to lift up and amplify the voices of elections officials. However, elections officials are often underfunded and understaffed, and there are limits to the reach of their content.

Thanks to support from funders and partners, we invested additional resources in 2022 to update and expand our inoculation content so it could be communicated by a diversity of trusted nonpartisan sources. In 2020, we found that the most shared and spread content used bright, engaging illustrations reflecting democratic values while sharing our key messages. We worked with illustrators to create dozens of images specific to the disinformation narratives we needed to combat (based on the intelligence we gathered from our monitoring).

In 2022, we also put a premium on translating our content into Spanish given the prevalence of disinformation in Spanish and the limited resources social media companies put into combating non-English disinformation.


Using messengers trusted by their communities is key for successful inoculation, and our partner network was critical to this effort. To make it easier to share inoculation content, we created an online searchable database for partner organizations. As disinformation narratives and threats changed, we continued to add content to this database as the calendar proceeded (moving to post-election inoculation content in the week before the election) and as new narratives came up.

Tracking reach is challenging, but through the use of the #OurElections hashtag and the analytics provided by partners, we believe our content garnered millions of views on Facebook, Twitter, and TikTok. In fact, our content was mentioned in other posts 15,000 times, engaged with (clicked or interacted with) by 60 million social media users, and viewed a total of 298 million times.

We also found that creating vibrant graphics that celebrated voter participation from all types of voters was necessary if partner groups were going to share these graphics and messages.

CASE STUDY: Election Night Is (Still) Not Results Night

In any election, the results reported on election night are unofficial. Currently, no state requires that official results be certified on election night itself. States have different rules on how and when to process ballots, and often it can take time to count military and overseas ballots, as well as validated provisional ballots. At the same time, it is often clear who the winner of an election is on election night, and news organizations have often “projected” a winner based on their analysis of unofficial reporting from elections officials and projections of where outstanding ballots are coming from.

Many of us have grown accustomed to hearing these projections and “knowing” the winner of the election soon after polls close. But as states change their rules on when and how to count ballots (which can delay the time it takes to release unofficial results), and as more voters use vote by mail or have an issue at the ballot box and need to use a provisional ballot, we shouldn’t always expect to have these unofficial results on election night.

Disinformation purveyors have used this expectation to create false narratives that claim that elections are somehow rigged or fraudulent if the unofficial results are not projected on election night or if these unofficial results (or media projections) change as counting—and the certification and verification process—plays out.

In 2020, a broad coalition of voter protection organizations joined forces to communicate this reality and push back on disinformation, highlighting that we should “count every vote” and that “election night isn’t results night.” This included grassroots communication through partner groups, traditional and social media outreach, media advisories, and much more.


 

The 2020 election was a unique election environment during the height of public concern around COVID-19. Elections officials and state officials expanded vote by mail and other voting options, and as a result, the use of vote by mail and other nontraditional methods increased dramatically from previous elections. That made the communications push to inoculate against “election night” disinformation more salient in 2020. By the time the November 2022 elections came up, there was a societal push to “return to normal,” efforts to roll back vote by mail in states, and reluctance from the media to run the same story twice about “election night isn’t results night.” Yet we knew to expect disinformation attacks using this narrative and needed to mobilize once again to communicate this inoculation message.

We held nearly a dozen briefings with national media outlets to highlight key disinformation narratives to expect. A key part of these briefings was to ensure that journalists with large platforms did not amplify disinformation when publishing related stories and that they understood the importance and impact of election disinformation.

In addition to leveraging media outreach, we mobilized our coalition to push pro-voter messages that defuse disinformation, including the “election night” messages.

Partnerships

As a coalition, we implemented lessons learned from 2020 to build on messaging and make sure it reached more audiences and touched on a wider range of subjects than before, keeping in mind the populations specifically targeted by disinformation.

To reach the most vulnerable voters and have the most impactful interventions on election disinformation, a large nonpartisan pro-voter coalition is necessary. The Election Protection coalition, led by Common Cause and the Lawyers’ Committee for Civil Rights Under Law, is the largest nonpartisan voter protection coalition, with over 100 national and state partners, and has been active for over two decades. While new efforts and coalitions are formed in each cycle, Election Protection maintains a presence throughout the “off years” (and in fact, engages in state, local, and special elections every year).

This makes for a robust network of organizations with a shared history and experience working with each other and produces remarkable results, including field efforts in dozens of states (where nonpartisan Election Protection volunteers assist voters at polling places), year-round engagement with elections officials, a nonpartisan voter protection hotline, and our Election Protection Anti-Disinformation Working Group, which Common Cause co-chairs and which serves as the hub of strategy and information sharing between organizations. Our work would be significantly hampered if it was necessary—due to funding, staffing, or organizational decisions—to start a new election anti-disinformation effort from scratch each year.

Partnerships: States

Instead, because of our ongoing engagement with Election Protection and the network of state leaders, we have built-in communications pathways into and between states. Backbone organizations like Common Cause can drive research, analysis, and strategy—including messages and message framing—maintaining coordination with the full coalition of state and national organizations. Leading up to, on, and after Election Day, we regularly updated and convened with people in the field doing voter protection work, journalists covering democracy issues, and state voter protection organization leaders.

In addition to our inoculation content, Common Cause shared coalition-wide updates on how to counter specific disinformation narratives (in coordination with state partners) that we witnessed in real time. Many of these were directed to groups working with limited-English persons, like CASA Pennsylvania, APIA Vote Michigan, Voces de la Frontera (Wisconsin), and our core national partners, Arab American Institute Foundation, NALEO, and APIA Vote.

To cite one group’s experience, the Arab American Institute Foundation was able to track an emerging disinformation narrative in Michigan and connect with leaders on the ground in California and Texas when it surfaced there. They noted that “this disinformation effort surprised local groups who were not familiar with this type of disinformation and how to counter-message. We were able to quickly gather key local allies and build their confidence with counter-messaging and reassurance.” Because of our partnership, they were able to help other state and local groups anticipate and work to disarm rapidly spreading narratives.

Another group, Voces de la Frontera, which is based in Wisconsin and works primarily in Spanish, was “surprised by how quickly we were able to detect and report fake content on all social media platforms. In the future this will allow us to navigate the web and media in a better way in upcoming elections.”

We also worked with partner groups to focus on communicating to voters in their communities with accessible information, inoculation content, and resources. As one subgrantee, Election Protection Arizona/Arizona Democracy Resource Center, reported, “I think it made some of our community members feel like they are being heard instead of just being targets in GOTV campaigns.”

Partnerships: Election Workers

As one of the largest national and state-based grassroots organizations advancing democracy and transparency, Common Cause Education Fund works on the ground in 30+ states alongside local and state elections officials and administrators. Because we’re in constant communication with these government actors, we both inform them of worrisome trends requiring fixes and pass along their information and updates to the public at large.


Since 2020, there has been a disturbing trend of rising threats and intimidation of election workers. Our anti-disinformation work removes threatening content and provides inoculation messaging that helps “turn down the temperature,” but additional support for elections officials is needed. Our sister 501(c)(4) organization, Common Cause, has helped successfully implement strong elections policies and practices across the states—in Colorado, for example, Common Cause helped pass the strongest set of reforms in the country—that not only ensure greater access for voters but additionally protect voters, and administrators, from disinformation, intimidation, and political violence.


For the 2022 midterm elections, we intentionally crafted and encouraged the spread of social media and offline content that contained positive messaging about election workers. We know that even some conspiracy theorists who promulgate false narratives about a “rigged election” believe that their own local elections are secure. In other words, when election workers are humanized as people like us and from our community, they are more likely (though not always) to be seen as working to fairly administer elections. Our content and narrative-building work in 2022 helped spread the narrative that elections are run by us, and not by nameless bureaucrats imported from somewhere else.

Our Work with the Media

We helped educate the media on how to accurately and responsibly report on election disinformation and saw noted changes in how information was conveyed to voters.

For the 2022 midterms, our team dedicated itself to engaging an even higher number of journalists than in 2020 to inform them on how to responsibly report on disinformation. This is an essential part of the puzzle because, unless done appropriately, writing about disinformation can exacerbate, rather than ameliorate, the problem. As such, last summer and fall, we held individual briefings for key reporters from ProPublica, the New York Times, the Washington Post, NPR, the Associated Press, Reuters, Bloomberg, and other outlets to provide them with background on our work to combat disinformation. A key part of these briefings was to ensure that these journalists—with large platforms—did not amplify disinformation when publishing stories we brought to their attention.

More tactically, we informed journalists of disinformation posts lingering on platforms to force action from social media platforms through highly publicized media pressure. In June, for example, we successfully got posts removed that targeted a specific individual with unproven ballot fraud accusations. And by placing stories in Bloomberg and ProPublica about a proliferation of posts targeting election workers, we successfully pressured platforms to both remove the posts and implement new rules.

In 2022, we saw a significant number of “inoculation” stories from the media—particularly about certification and the voting process. We helped mobilize the Election Protection community to ensure that we were speaking with one voice on inoculation messaging. We didn’t want people to forget that results would take time in 2022 as well, so we highlighted how important it was for the media to set expectations about election timelines. Inoculation stories from the media included positive stories about democracy in action, such as our nonpartisan voter protection work and the volunteers who make it happen.

Our emphasis on inoculation work in non-English languages also made an impact. In addition to creating content, providing strategy, and sharing resources with our nonprofit partners, as detailed earlier, our ongoing partnership with PolitiFact produced 32 articles in Spanish on election issues, published in three major Spanish-language outlets. This filled the information gap about the election process in Spanish, prevented disinformation from taking root in limited-English-proficiency communities, and provided our Spanish-language partners with key resources to rebut or inoculate against Spanish-language disinformation.

Section Three: Looking Ahead

The year 2024 will present new challenges. As such, we’ll both continue the work we’ve honed over the last few cycles and move into new territory too.

Many Candidates Continue to Endorse Election Denial

Election disinformation is now essentially obligatory for nearly all Republican primary candidates and opportunists seeking financial benefit.

The same polling that demonstrates increased trust in elections still shows the continued impact of disinformation. Voters have more confidence in their local elections than in national elections, and 51% of Republicans “say they think people submit too many ballots in drop boxes either very or somewhat often.” This illustrates how, even though the trust gap has narrowed since 2020, election disinformation myths and perceptions of widespread fraud still persist.

Still, “one third of the [Republican] party’s 85 candidates for governor, secretary of state and attorney general”—officials who would be responsible for election oversight—“embrac[ed] Trump’s efforts to overturn his 2020 loss.” Half of them, incumbents in particular, secured seats in 2022. And 220 election skeptics who “cast doubt on the 2020 election,” three dozen of whom denied the 2020 results outright, won seats in the U.S. House of Representatives. Despite their positions in power, election deniers still can’t provide the evidence they claim they have. In one telling example, Arizona State Senator Wendy Rogers claimed that she couldn’t give evidence to agents because “she was waiting to see the ‘perp walk’ of those who committed fraud during the election.” With this remark, Rogers gave further oxygen to conspiracies while refusing to elaborate.

The University of California San Diego’s Yankelovich Center finds that there’s a partisan gap in trust in elections: “Democrats are more than twice as likely as Republicans (85% versus 39%) to view the results of this [2022] November’s election as accurate, while Republicans are more than five times as likely (43% versus 8%) to suspect significant fraud.” There’s some hope, though: Bright Line Watch finds that “public confidence that votes were counted accurately at the local, state, and national levels increased after the election and beliefs in voter and election fraud decreased. The changes were generally largest among Republicans.” Additionally, a Monmouth poll finds that 55% of Republicans surveyed claim that Biden’s 2020 win was illegitimate (down from 69% in their last poll). Finally, the Carnegie Endowment for International Peace has a new report detailing why political violence didn’t materialize in the wake of the 2022 election: among other reasons, Trump may be uniquely able to mobilize supporters for it.

Despite these encouraging trends, election disinformation will remain an issue as long as it is popular and profitable for disinformers to promote it.

Tech Platforms Are Backing Down on Civic Integrity

It will be more difficult to rely on tech platforms to act responsibly and enforce their policies.

In dealing with platforms for the 2022 election, we experienced inconsistent applications of policy, conflicting information on violative content, and instances where we were simply ignored. This is all in accordance with a general trend: tech platforms are cutting staff dedicated to misinformation. For example, there is just one person left to handle misinformation policy at YouTube, and YouTube recently announced that it will no longer enforce civic integrity policies around 2020 election disinformation. Other platforms have similarly reduced their policy staff. As people continue to seek news from social media, the problem of disinformation will persist and intensify—and ever fewer staff will be on hand at major platforms to deal with it.

One example of inconsistent application of policy is how major platforms have treated the restoration of Donald Trump’s accounts. Major platforms like YouTube, Meta, and Twitter deciding to restore Donald Trump’s accounts shows that either they don’t understand that the threat of incitement isn’t over, or they’ve chosen potential profit over people. Meta claims that there are new guardrails to prevent Trump from inciting further violence, such as “heightened penalties” for future violations.


They also say that the risk has “sufficiently receded.” It’s worth noting in response to that claim that we are still experiencing Trump-incited political violence, whether it’s an election denial shooting in New Mexico or an attempt to assassinate the Speaker of the U.S. House. It also appears that Trump is amplifying more and more extremist content on his own social media site (although his accounts were restored, he rarely posts on Meta platforms and has not yet posted on Twitter)—Accountable Tech found more than 350 posts that would violate Facebook’s standards.

Meta says that “in the event that Mr. Trump posts further violating content [on Facebook or Instagram], the content will be removed and he will be suspended for between one month and two years.” If he posts content that isn’t violating, like “content that delegitimizes an upcoming election or is related to QAnon,” the spread of the post will be limited. Trump has amplified QAnon accounts more than 400 times on Truth Social, and the election denial movement as a whole is getting closer to QAnon. It’s concerning that delegitimizing the 2020 election or posting QAnon content isn’t considered violative, given the very real potential for violence.

If the hands-off approach to Donald Trump’s accounts is any indication, the Election Protection community will have new challenges to face due to social media platforms’ inaction.

The Remaining Threat of Political Violence

There is remaining potential for political violence to flare up in the wake of indictments of a presidential candidate and a primary focused on rehashing lies about 2020.

Violent rhetoric online is still motivating political violence offline: David DePape, arrested for his October 2022 assault on Paul Pelosi, the former Speaker’s husband, made claims of a stolen election to police. Further review of his online activity shows that he was steeped in conspiracy theories.

Election denial and conspiracy theories were also key motivators in the shootings at the homes of Democratic officials and elected lawmakers in New Mexico last year. Losing New Mexico House GOP candidate Solomon Pena orchestrated, and even participated in, shootings at the homes of elected officials he believed had rigged his and other elections. Pena’s campaign website, which is still live, contains alarming rhetoric about the 2020 election, including the claim that the “offenders are not criminal defendants, they are enemy combatants.” Pena was also inspired, as Talking Points Memo notes, by the work of election deniers like David Clements. Texts from his phone included messages about certification and claims that “they sold us out to the highest bidder,” as well as the addresses of the officials targeted. According to a confidential informant, Pena intended the shootings to cause harm.

The presidential primaries will begin soon, and one major candidate is facing several potential indictments. In response, Trump has called for his supporters to instigate violence on his behalf as indictments loom. Recent posts by Trump on his social media network, Truth Social, claim that “the Democrats used Covid inspired Mail In Ballots to CHEAT…. Now they are using PROSECUTORS to CHEAT,” and “the Democrats are using Prosecutors for purposes of Election Interference. It is their new way of CHEATING on Elections!” In this way, Trump is trying to connect his claims of a rigged election in 2020 to his new troubles—and trying to incite the same response from his supporters.

There were also attempts by Trump to incite violence against Manhattan District Attorney Alvin Bragg—culminating in Bragg receiving a suspicious envelope with white powder. Trump’s rhetoric aimed at Bragg contained claims of Soros ties and featured thinly veiled racist remarks. This type of inciting language will likely continue to be used and amplified by partisan disinformers as the election grows closer and has the potential to inspire further political violence.

Election Disinformation Fuels Voter Suppression

More legislative norms are being eaten away by lawmakers seeking to cultivate support from a base steeped in conspiracy theories—and introducing legislation premised on lies about elections.

It’s easier for elected officials to pass restrictive voting laws under the guise of election integrity if voters believe in any number of unfounded conspiracy theories that keep the idea of widespread partisan voter fraud at the forefront.

Election disinformation is thus the “tail that wags the dog” as states pass laws restricting voter access. Some politicians even fund specially designed law enforcement units to find “voter fraud,” creating a vicious cycle of headlines about arrests for election crimes—despite the fact that most individuals prosecuted were in fact given wrong information by state employees. The goal of these voter intimidation squads is to depress the vote, especially in communities of color. And the idea is catching on, with states ranging from Virginia to Arkansas proposing similar units. Florida’s Secretary of State has even proposed expanding the state’s “election crimes unit” from 15 employees to 27, with a corresponding budget increase to $3.15 million.

Other states continue to introduce new legislation to restrict access to mail voting and to drop boxes. Voting Rights Lab counts hundreds of bills introduced so far this year that reduce access to the vote and criminalize actions of election administrators. The Brennan Center for Justice has counted 150 restrictive voting bills introduced this year, ranging from bills that restrict vote by mail to bills that criminalize errors by election officials. Election conspiracies continue to provide the foundation for further voter restrictions, and even many new decisions made about election administration can be based on the myth of widespread fraud. For example, the majority of the Shasta County, California, Board of Supervisors voted to end their contract with Dominion Voting Systems over fraud claims and disinformation about voting machines: “Just because we’re all sitting up here and elected does not mean we had free and fair elections every single time,” said one supervisor.

Election-denier politicians are already starting to do what they did in 2020: coordinate to introduce vote-suppressive and anti-administrative legislation across the states. To do so, they’re resurfacing old rhetoric about voter fraud and election rigging to push photo ID laws, cut reforms that facilitate voting, and criminalize elections officials’ work. Look no further than one proposed bill in Kansas under which drop box access would be highly restricted—for fear of “mules.” Not only would drop boxes be limited under this proposed bill, but video cameras would also record the faces of voters dropping off ballots.

Lawmakers in Kansas cited the debunked documentary 2,000 Mules as motivation for the bill: “I think part of the concern that’s kind of driven bills like this has been partly the whole notion of what are called mules, as far as that somehow somebody’s going to stuff a ballot box akin to, you know, there was a documentary called ‘2,000 Mules’ that came out a year ago.” And in Nebraska, a legislator who introduced a voter suppression bill didn’t endorse a belief in widespread fraud, but said “the perception is—there is. … And perception is reality.” Election disinformation, even when acknowledged by its proponents as false, is used to fuel legislative voter suppression under the guise of protecting elections.

Attacks on the Voting Process

Any institution, tool, or practice, even ones with bipartisan support and buy-in, can become a successful target of disinformation.

Not only are new restrictions proposed almost daily across the country, but existing rules and procedures are newly challenged. In Kansas, a grace period allowing mail-in ballots to be returned up to three days after an election came under fire, even as the state legislators fighting it conceded that widespread fraud isn’t real: “I mean, people do question the fraud all the time. Is there fraud? I think actually we’re a fairly good state. But we can always make things better.” Even when lawmakers acknowledge that there’s nothing to the conspiracies they base bad bills on, they still cite the perceptions of election insecurity—perceptions they themselves created—as a reason to advance them.

This isn’t solely limited to legislation, either. Election deniers and disinformers are able to take any voting process out of context and portray it as something nefarious to their audiences, and the limit is the disinformers’ own creativity. Parts of the process as mundane as what type of pen is used at the polls, how signatures are checked, how ballot tabulators work, and even how long it takes to announce results can and have been targeted for disinformation—and used to erode confidence in our election systems.

Legislative Recommendations

New legislation points a way forward for grappling with both emerging and existing threats to voters.

As mentioned in Common Cause’s 2021 report on election disinformation, there are a number of federal and state laws that already exist to help protect the freedom to vote without intimidation. There are also several legislative proposals that would further aid in the fight against election denialism and help protect voters through their focus on protecting election workers, tackling disinformation in political advertising, and fighting deceptive practices. While no bill will solve every issue we face, there are several bills that would protect access to the vote.

In addition to the bill recently introduced by Senator Amy Klobuchar and Representative Yvette Clarke, which would regulate AI-generated content in political advertising, the recently reintroduced Freedom to Vote Act provides a number of solutions for problems of voter intimidation and access. Not only does it increase access to the vote by promoting online registration and allowing for same-day registration, but it also establishes further protections for disabled voters and election workers. The Freedom to Vote Act also includes provisions against deceptive practices, such as prohibiting false statements made within 60 days of a federal election that would prevent someone from exercising their right to vote.

Conclusion

Through Common Cause’s years of experience tracking, analyzing, and disrupting election disinformation, we’ve learned how it is made, how it is spread, and how to combat it. Disinformers operate by taking advantage of preexisting narratives and using social media to amplify them, seeking out audiences who may be more susceptible to disinformation about voting and elections or unable to identify the correct information on the subject.

At Common Cause, we see protecting our democracy, ensuring that voters can vote without barriers to the ballot box, and fighting back against all types of voter suppression as central to our work. That’s why our Election Protection efforts take place 365 days a year, not just in the weeks surrounding Election Day. As the threats to voter participation have grown, we have expanded our work to include the new frontier of voter suppression—disinformation, political violence, and election sabotage.

Understanding major disinformation narratives as they arise allows us to better prepare for and respond to attacks on the right to vote. We will no doubt see more acts of political violence, threats, and intimidation fueled by election disinformation in 2023 and 2024. We will publish a memo later this year that outlines the disinformation narratives we anticipate in 2024 and how we intend to combat them.

As these threats to democracy are linked, so is our response. The recommendations made here, if implemented, would have an appreciable impact on the threat of election denialism and disinformation.
