Online child protection

At Meta, child protection is always a top priority.

Overview

We've developed a three-pronged, industry-leading approach to protecting young people online. First, we focus on preventing harm from happening in the first place. We do that by enforcing zero-tolerance policies and developing cutting-edge, preventative tools. Then, we make it easy to report potential harms, and we act on those reports. And throughout, we collaborate with global experts and industry partners to keep improving the tools that keep young people safe.

Building a child protection ecosystem

Our work to safeguard children extends to the broader internet. We recognize the power of cross-industry collaboration to create a child safety ecosystem.

Since 2016, we have hosted regular child safety hackathons with NGOs, where participants code and prototype projects that make the internet a safer place for children. Likewise, our photo- and video-matching technologies have been open source since 2019. By sharing this code with the rest of the tech industry, we hope to enable more companies to keep their services safe. In 2020, we collaborated with partners across the industry to establish Project Protect, a coalition designed to protect young people online and guide the work of the Technology Coalition for the next 15 years.
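
To make the matching idea concrete, here is a minimal sketch of how perceptual-hash matching works in tools of this kind: each image receives a compact hash, and two hashes that differ in only a few bits are treated as the same underlying image (our open-source photo hasher, PDQ, works on this principle with 256-bit hashes). The threshold and hash values below are illustrative assumptions, not the released implementation.

```python
# Minimal sketch of perceptual-hash matching. The threshold and the
# hash bank are illustrative assumptions, not a released implementation.

MATCH_THRESHOLD = 31  # example Hamming-distance cutoff (assumption)

def hamming_distance(a: int, b: int) -> int:
    """Count the bits in which two equal-length hashes differ."""
    return bin(a ^ b).count("1")

def matches_known_content(candidate: int, hash_bank: set[int]) -> bool:
    """True if the candidate hash is close to any hash of previously
    identified violating content."""
    return any(hamming_distance(candidate, known) <= MATCH_THRESHOLD
               for known in hash_bank)

# Two hashes differing in a single bit count as the same image.
bank = {0x0F0F0F0F}
assert matches_known_content(0x0F0F0F0E, bank)
```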

We also work closely with our safety advisors, a group that includes experts and NGOs from around the world. Our joint efforts include developing industry best practices, building and sharing technology to fight online child exploitation, and supporting victim services.

Advancing technological interventions

Understanding how and why people share exploitative content is critical to combating it. We conduct careful, in-depth analysis of how such content spreads to inform our prevention work.

We use what we learn from this analysis to deploy tools and launch new programs that reduce the sharing of this abhorrent content. For example, we've deployed a search intervention aimed at deterring searches for child exploitative content.
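
As a rough illustration, such an intervention can be thought of as a check that runs before search results are returned; the term list, notice text and function names below are hypothetical stand-ins for systems that are curated with expert input.

```python
# Sketch of a search intervention: queries matching terms associated
# with child exploitation return no results, only a deterrence notice
# pointing to help resources. All names here are placeholders.

BLOCKED_TERMS = {"example_blocked_term"}  # hypothetical; real lists are expert-curated

DETERRENCE_NOTICE = ("Child sexual abuse is illegal. Confidential help "
                     "and reporting options are available.")

def run_search(query: str) -> list[str]:
    """Stand-in for the normal search backend."""
    return [f"result for {query!r}"]

def handle_search(query: str) -> dict:
    """Intercept harmful queries; serve everything else normally."""
    if set(query.lower().split()) & BLOCKED_TERMS:
        return {"results": [], "notice": DETERRENCE_NOTICE}
    return {"results": run_search(query), "notice": None}
```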

On Instagram, we piloted the use of AMBER Alerts to allow people to see and share notices of missing children in their area. Since 2015, AMBER Alerts on Facebook have been successful in helping authorities find missing children.

Accountability and enforcement reports

We strive to be open and proactive in the way we safeguard users’ privacy, security and access to information online. Our quarterly Community Standards Enforcement Report includes metrics on how we enforce our child safety policies, including how much violating content we detect and remove.

Stop child sexual abuse material

We’re constantly updating and creating new child protection tools. We’ve expanded our work to detect and remove networks that violate our policies, and we've also made it easy to report violating content. Here are some product protections that work behind the scenes, invisible to the user:

  • Once we identify potentially suspicious adults on Facebook and Instagram, we work to prevent them from discovering and connecting with teen accounts. This intervention can take a number of different forms (sketched in code after this list). For example:
    • We remove young people’s accounts from these adults’ “People You May Know” suggestions.
    • We won’t allow these adults to comment on young people’s posts, and they won’t be able to see young people's comments on other people’s posts.
    • If they find a young person’s account by searching for their username, they won’t have the option to friend or follow them.
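
The sketch below shows how rules like these could be expressed in code. The account fields and the flagging signal itself are hypothetical; the production systems are considerably more nuanced.

```python
# Simplified sketch of the restrictions listed above. The Account
# fields and the 'flagged_suspicious' signal are hypothetical.

from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    is_minor: bool = False
    flagged_suspicious: bool = False

def can_recommend(viewer: Account, candidate: Account) -> bool:
    """Keep young people's accounts out of a flagged adult's
    'People You May Know' suggestions."""
    return not (viewer.flagged_suspicious and candidate.is_minor)

def can_comment(viewer: Account, author: Account) -> bool:
    """Block flagged adults from commenting on young people's posts."""
    return not (viewer.flagged_suspicious and author.is_minor)

def can_follow(viewer: Account, target: Account) -> bool:
    """Hide the friend/follow option when a flagged adult finds a
    young person's account via username search."""
    return not (viewer.flagged_suspicious and target.is_minor)
```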

Our child safety policies clearly state what is allowed on Facebook and Instagram. Violations include sharing otherwise innocent images of children with inappropriate captions, hashtags or comments.

Help protect children

We do not allow content that sexually exploits or endangers children. Yet sometimes people do post sexual images and videos of children. We know that reposting such content, even without malicious intent, re-victimizes the child. Sharing this content violates our policies, regardless of intent.

We are working with public awareness experts to launch a new PSA campaign that encourages people to stop and think before resharing those images online and to report them to us instead.

Reporting helps stop child sexual abuse content from spreading and protects children from further harm.

Cross-industry partnership to combat the spread of child sexual abuse material

The Take It Down portal, created by NCMEC with Meta’s support, is one step you can take to help remove nude, partially nude or sexually explicit photos and videos of yourself that were taken before you were 18*.

Take It Down works by assigning a unique digital fingerprint, called a hash value, to nude, partially nude or sexually explicit images or videos of people under the age of 18. Participating online platforms can use these hash values to detect when the images or videos are posted on their services and remove them. This all happens without the image or video ever leaving your device; only the hash value is uploaded to Take It Down and provided to NCMEC. If someone tries to post a matching image on a participating company’s platform, the company reviews the content, checks whether it violates its policies and takes action accordingly.

*If you have an intimate image or video you are concerned will be shared or re-shared and you are over 18, you can take steps to prevent further circulation through StopNCII.org.
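
To illustrate the flow described above: the fingerprint is computed locally and only that value is shared. The hash algorithm and function names in this sketch are assumptions; the portal's actual client and hash format are not specified here.

```python
# Sketch of the on-device hashing flow: only the fingerprint, never
# the media itself, leaves the device. SHA-256 stands in for whatever
# hash the real client uses (an assumption for this sketch).

import hashlib

def fingerprint(path: str) -> str:
    """Hash a local image or video file in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()  # this string is all that gets uploaded

def flag_for_review(upload_hash: str, ncmec_hashes: set[str]) -> bool:
    """A participating platform compares new uploads against the hash
    list distributed by NCMEC and routes matches to policy review."""
    return upload_hash in ncmec_hashes
```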

Stop Sextortion

Sextortion is the threat to reveal intimate images to get you to do something you don't want to do. Our Stop Sextortion resources offer support and information to those who need it.

Our policies

We do not allow content that sexually exploits or endangers children. When we become aware of apparent child sexual exploitation, we report it to NCMEC (National Center for Missing & Exploited Children). NCMEC coordinates with law enforcement authorities from around the world.

We also work with external experts to discuss and improve our policies and enforcement around online safety issues.

People share nude images of children for a variety of reasons, including out of outrage or as a misguided attempt at humor. We know that sometimes people share nude images of their own children with good intentions. However, we generally remove these images because of the potential for abuse by others, who could reuse or misappropriate them.

The Facebook Community Standards explain what is and is not allowed on the app. Our policies on Child Exploitation are a part of these guidelines, and are actively enforced.
