Combating sextortion and intimate image abuse

Sextortion is the threat to reveal intimate images to force you to do something you don't want to do. Sharing—or threatening to share—intimate images without consent is against Meta policies. Nobody should ever have to experience sextortion. We work to prevent this type of behavior, and when we become aware, we work to take action. This page includes information on how we work to fight this abuse and how you can take immediate action if this happens to you.

Meta’s work to combat sextortion

If someone tries to use a personal intimate image as a threat to make you do something you don’t want to do—like send money, additional images or have sexual contact—that is a crime known as sextortion. We have built safeguards and technology to help prevent and combat this abuse—and stop these criminals from causing harm.

Our policies and enforcement

We have strict rules against content or behavior that exploits people, including sharing or threatening to share someone’s intimate images. We encourage anyone who sees content they think breaks our rules to report it—and we have a dedicated reporting option to use if someone is sharing private images. When we become aware of this content, we work to take action.

We have specialized teams working on combating sextortion. We have identified patterns associated with this behavior, and built automated systems that detect and remove these accounts at scale. We also have dedicated teams that investigate and remove these criminals and report them to authorities, including law enforcement and the National Center for Missing and Exploited Children (NCMEC), when appropriate. We work with partners, like NCMEC and the International Justice Mission, to help train law enforcement around the world to identify, investigate and respond to these types of cases.

How we prevent unwanted contact

We work to protect people from sextortion by preventing unwanted contact, especially between adults and teens. We do this in a variety of ways:

  • We automatically set teens’ accounts to private when they join Instagram if they’re under 16 (or under 18 in certain countries). We also don’t allow people teens don’t follow to tag or mention them, or to include their content in Reels Remixes or Guides, by default. These are some of the best ways to help keep young people from hearing from adults they don’t know or don’t want to hear from. Private accounts, available to both teens and adults, also prevent other people from seeing friend/follow lists, which can be used as a lever by people trying to sextort others.
  • We restrict adults over 19 from sending private messages to teens who don’t follow them on Instagram. We also limit adults from messaging unconnected teens on Messenger.
  • We limit the type and number of direct messages someone can send to an account that doesn’t follow them on Instagram, restricting them to one text-only message until the recipient accepts their request to chat. This helps prevent people from receiving unwanted images, videos or repeated messages from people they don’t know.
  • We also introduced stricter default message settings for teens under 16 (and under 18 in certain countries). This means teens can’t be messaged or added to group chats by anyone they don’t follow or aren’t connected to on Instagram and Messenger. This is designed to help teens feel even more confident that they won’t hear from people they don’t know in their private messages.
  • We use technology to identify and prevent potentially suspicious adults from finding, following and interacting with teen accounts, and we don’t recommend teen accounts to these adults.

How we educate people

We educate teens when they’re interacting with certain adults.

  • We let teens know when a potentially suspicious account has attempted to follow them, and encourage young users to be more cautious.
  • We have proactive prompts—or safety notices—that notify young people when an adult who has been exhibiting potentially suspicious behavior is interacting with them in messages. For example, if an adult is sending a large number of friend requests to people under 18, we’ll alert the recipients within their messages and give them an option to block, report, or restrict the adult.

Expert partnerships

We are committed to working with expert partners to combat sextortion around the world and have been dedicated to this work for many years.

  • We worked with Thorn to launch Stop Sextortion in 2017 and partnered with them to adapt these resources for our Safety Center, translating them into 60+ additional languages. We partnered again with their NoFiltr brand to create educational materials. These resources were designed to help reduce the shame and stigma surrounding intimate images, and empower teens to seek help and take back control if they’ve shared them or are experiencing sextortion.
  • Meta is also a founding member of the Lantern program, managed by the Tech Coalition, which enables technology companies to share signals about accounts and behaviors that violate their child safety policies. We provided the Tech Coalition with the technical infrastructure that sits behind the program and continue to maintain it. Participating companies can use this information to conduct investigations on their own platforms and take action. This is incredibly important because we know that predators don’t limit themselves to any one platform—so we need to work together to tackle this.
  • Meta also continues to expand its Global Network of Partners, more than 90 victim advocates and non-profits around the world who run prevention campaigns and provide support to survivors of sextortion and intimate image abuse.

Tools and resources

Instagram and Facebook are founding members of Take It Down, a platform by NCMEC that works to proactively prevent young people’s intimate images, including AI-generated content, from spreading online. We provided financial support to NCMEC to develop Take It Down, building on the success of StopNCII, a platform we developed that helps adults stop the spread of their intimate images online. We’ve made both Take It Down and StopNCII easily accessible on our apps when people are reporting potentially violating content.

We’ve also developed ways to help people control their own experience. For example, people can choose who can message them, and can block anyone they don’t want to hear from. We also encourage people to report content they think breaks our rules, and we prompt teens to report at relevant moments, such as when they block someone.


We’ve developed more than 30 tools and features to help support the safety of teens and families across our apps, including supervision tools for parents and guardians, and specific education and resources about sextortion. Anyone seeking support and information related to sextortion can visit our education and awareness resources, including the Stop Sextortion resources, developed with Thorn.

Stop sextortion


Stop sextortion resources help those seeking support and information. They outline immediate actions you can take if you or a friend is experiencing sextortion, as well as expert tips for teens, parents and guardians.

How to report threats and intimate images shared without permission to Meta

You can report nude or sexual photos or videos of yourself, or threats to share them, on our apps and technologies to prevent them from being reshared. Our teams review reports 24/7 in more than 70 languages and will take action on violating content and behavior. Learn how to report with the links below:

How to prevent a nude or sexual photo or video of yourself from being shared online

If you have a nude or sexual photo or video of yourself, there are tools to help you prevent it from being shared or reshared online:

StopNCII - Over 18

If you have an intimate image or video that was taken when you were over 18 and are concerned it will be shared or reshared online, you can take steps to help prevent further circulation through StopNCII.


The tool uses hash-generating technology that assigns a unique hash value (a numerical code) to an image, creating a secure digital fingerprint. Tech companies participating in StopNCII receive the hash and can use it to detect if someone has shared or is trying to share those images on their platforms.
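To make the "digital fingerprint" idea concrete: the key property is that the hash is computed entirely on the person's own device, so only a short code is ever shared, never the image. The sketch below uses a standard cryptographic hash (SHA-256) purely for illustration; the function name is ours, and production systems like StopNCII typically use perceptual image hashes instead, so that resized or lightly edited copies still match.

```python
import hashlib


def fingerprint(image_path: str) -> str:
    """Compute a hash of an image file locally, on-device.

    Only this short hex digest would ever be shared with the
    matching service; the image itself never leaves the device.
    (Illustrative only: real systems use perceptual hashing.)
    """
    sha = hashlib.sha256()
    with open(image_path, "rb") as f:
        # Read in chunks so large videos don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest()
```

Because the hash is a one-way function, the image cannot be reconstructed from the fingerprint that participating companies receive.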

Take It Down - Under 18

If you have a nude or sexual photo or video of yourself that was taken when you were under 18 and are concerned it will be shared or reshared online, you can take steps to help prevent further circulation through Take It Down.

Similar to StopNCII, Take It Down assigns a unique hash value (a numerical code) to your image or video privately, without the image or video ever leaving your device or anyone viewing it. Once you submit the hash value to NCMEC, companies like ours can use those hashes to prevent the content from being posted on our apps in the future.
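On the platform side, the matching step described above amounts to checking each new upload's fingerprint against the list of submitted hashes. This is a simplified sketch with hypothetical names and values, not Meta's or NCMEC's actual API; real deployments generally compare perceptual hashes within a distance threshold rather than testing exact equality.

```python
# Hashes previously received from the matching service
# (hypothetical example values, not real submissions).
SUBMITTED_HASHES = {
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}


def should_block(upload_hash: str) -> bool:
    """Return True if an upload's fingerprint matches a submitted hash.

    The platform only ever compares hashes; it never receives the
    original image or video from the matching service.
    """
    return upload_hash in SUBMITTED_HASHES
```

Exact set membership keeps the example simple; a perceptual-hash system would instead flag any upload whose hash is within a small Hamming distance of a submitted one.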

This option is for:

  • Teens under 18 years old who are worried their nude photo or video has been or will be posted online
  • Parents or trusted adults on behalf of a teen
  • Adults who are worried about nude imagery taken of them when they were under 18