That’s why we make preparing for elections one of our highest priorities. Since 2016, we have made massive investments in safety and security with more than 40,000 people working on these issues. We’ve taken steps to reduce the spread of misinformation and provide more transparency and control around political ads.
We’re continuing to connect people with details about voter registration and the election from their election officials through in-Feed notifications and our Voting Information Center in the US.
Our security teams investigate and take down coordinated networks of inauthentic accounts, Pages and Groups that seek to manipulate public debate.
We block millions of attempts to create fake accounts every day. Facebook Protect further secures the accounts of elected officials, candidates and their staff.
We work with governments, law enforcement agencies, nonprofits, civil rights groups and other tech companies to stop emerging threats.
We’re fighting foreign interference and domestic influence operations, and we’ve removed over 150 networks of Coordinated Inauthentic Behavior since 2017.
We add warnings to posts that are debunked by fact-checkers, so you can decide what to read, trust and share.
We remove content that violates our Community Standards, including fake accounts and misinformation that may contribute to the risk of imminent violence or harm.
We started partnering with independent third-party fact-checkers in 2016 to review potentially false news and content. Today we work with 90 partners covering more than 60 languages.
When third-party fact-checkers label content as false, we significantly reduce its distribution on Facebook and Instagram so fewer people see it.
We’re committed to fighting voter interference and suppression. That’s why we remove content that includes incorrect voting information as well as ads that advise people not to vote.
Every political and social issue ad that runs on Facebook and Instagram is stored in a searchable Ad Library for seven years. We offer tools to help journalists, researchers and policymakers conduct analysis and learn more.
To run a political or social issue ad, advertisers must go through our authorization process, which includes proving who they are and where they live.
Ads about social issues, elections or politics include “Paid for by” disclaimers to show who’s behind the ad. US advertisers must now provide additional information, like a Federal Election Commission ID or tax ID number.
We show you the confirmed owner and locations of Facebook Pages, and we label state-controlled media in the US. For Instagram accounts with large audiences, you can see information such as the country where the account is based.
CrowdTangle, our social media monitoring tool, lets you see what presidential candidates are saying on Facebook and Instagram, as well as branded content they sponsored.
You can see how much presidential, Senate and House candidates spent on ads leading up to the 2020 US elections.
Meta’s approach to the 2022 US midterm elections
We have a dedicated team focused on the 2022 midterms to help combat election and voter interference while helping people vote.
How Meta is Preparing for the Philippines’ 2022 General Election
To help keep people safe during the upcoming Philippine general election, we’ve built new products and developed stronger policies.
Community Standards Enforcement Report, First Quarter 2022
We're sharing metrics on how we enforced our policies from January through March 2022.