
We are committed to protecting your voice and helping you connect and share safely.

When we’re open to hearing each other with tolerance and respect, we all benefit.

Social media has enabled more voices to be heard, but some people use it to do harm.

That’s why we have Community Standards that specify what’s allowed on our apps, and we remove anything that breaks these rules. For content that doesn’t violate our rules but has been rated false by independent fact-checkers, we reduce its distribution to help prevent it from going viral. We also provide context about what you see so you can make your own decisions on what to read, trust and share.


We remove hate speech, harassment, threats of violence and other content that has the potential to silence others or cause harm.

Community Standards

We consult with experts around the world to review and regularly update our standards on what is and is not allowed on Meta's apps.

Investments in Safety

Since 2016, we've more than quadrupled our safety and security teams, which now number over 40,000 people.

Finding Violating Content

We detect the majority of the content we remove before anyone reports it to us.

Blocking Fake Accounts

We block millions of fake accounts from being created every day.

Giving People Control Over What They See

We give you control over your experience by allowing you to block, unfollow or hide people and posts.

Tools to Prevent Bullying

You can moderate comments on your posts, and on Instagram, we warn you if you’re about to post something that might be offensive so you can reconsider.


How We Improve

Learn how we update our Community Standards, measure results, work with others, and prioritize content for review.


We work to limit the spread of misinformation and give you context to make your own decisions on what to read, trust and share.

Third-Party Fact-Checking Program

Today, we work with more than 90 partners covering over 60 languages around the world to review potentially false content.

Labeling Misinformation

We include warnings on posts that are rated false so that you can decide what to read or share.

Preventing Misinformation from Going Viral

When third-party fact-checkers label content false, we significantly reduce its distribution so fewer people see it.


Combating COVID-19 Misinformation

Since the World Health Organization declared COVID-19 a global public health emergency, we’ve been working to connect people to accurate information from health authorities and taking aggressive steps to stop misinformation and harmful content from spreading.


We won’t always get it right, so we invite people to appeal our content decisions. We also track and share our progress in making Meta's apps safer.

User Review and Appeal

People can report content to us or appeal certain decisions if they think we made a mistake in taking something down.

Oversight Board

This global body of experts independently reviews Meta's most difficult content decisions, and its decisions are binding.


Community Standards Enforcement Report

Every quarter, we release a report providing metrics on how we’re doing enforcing our content policies.


Of the hate speech we remove from Facebook, we proactively detect more than 95% before anyone reports it to us.



Marking This Year’s Safer Internet Day

We’re launching a series of educational campaigns to give people the tools and knowledge to enjoy a safer, more secure internet.



Community Standards Enforcement Report, First Quarter 2022

We're sharing metrics on how we enforced our policies from January through March 2022.



Expanding Counterspeech Initiatives Into Pakistan and the UK

When someone in Pakistan or the UK searches on Facebook for content related to organized hate or violent extremism, they will now be redirected to resources and support.