Suicide prevention

We care deeply about safety and well-being across our platforms. Since 2006, we've worked with suicide prevention experts to support the Meta community.


Meta provides technology that helps connect people to resources and experts who can help. Our Emotional Health Hub offers an array of expert mental health tips and education. Find tools that may help with suicide, anxiety, depression, and managing well-being, both for you and those around you.

Prevent suicide

Experts believe that contact from people who care can help prevent suicide. Meta can help provide that connection and point you to resources offering support and expertise on preventing suicide. Whether you're struggling on your own or concerned about someone else, we hope these resources can help.

Suicide prevention resources

For me

If you're struggling through a difficult time, know you’re not alone. We can connect you to people who can help.

For friends

You may provide a vital link of support for a friend or family member in distress. Review tips from suicide prevention experts and see what you can do.

Immediate help

A suicide helpline can offer an immediate connection for you or a friend. Contacting an expert when in distress can make a huge difference. Call a local hotline in your country for help.

Our suicide prevention efforts

Thanks to the relationships that Meta helps build, we can help connect people who may be in distress with supportive friends and knowledgeable experts. Across the world, there is one death by suicide every 40 seconds, and it is the second-leading cause of death among people 15 to 29 years old. Meaningful connection is one of the most powerful tools for preventing suicide, and at Meta, we want to be part of the solution.

Since 2006, just two years after Facebook first launched, we've worked with experts in suicide prevention and safety to support communities across Meta apps. We continue to consult regularly with these experts and seek input on current research and best practices to help make sure we're always putting our community's safety and well-being first.


Our policies

We care deeply about the safety of our community, and with the advice of experts, we set policies for what is and isn't allowed on Facebook and Instagram. We've never allowed people to celebrate or promote self-harm or suicide, and we also remove fictional depictions of suicide and self-harm, as well as content that shows methods or materials. We do, however, allow people to discuss suicide and self-injury, because we want Facebook and Instagram to be places where people can share their experiences, raise awareness about these issues, and seek support from one another. While we allow people to share content discussing their own experiences with these issues, we use technology to avoid recommending it. We also show a message offering support to anyone who searches for terms related to suicide or self-harm, and when we become aware that someone has posted about these issues, we provide resources and direct them to local organizations that can support them.

We work continuously with global experts to strike the delicate balance between giving people space to share their experiences and seek support, and protecting others from seeing potentially harmful content.

How we work

Providing resources

When someone expresses thoughts of suicide, it can be critical to get help as quickly as possible. Suicide prevention resources are available on Facebook and Instagram. These resources were developed with leading mental health organizations and with input from people with personal experience, which helps us continually improve our response.

Machine learning

Using machine learning technology, Meta has expanded our ability to identify possible suicide or self-injury content, and in many countries we're able to use this technology to get timely help to people in need. This technology uses pattern-recognition signals, such as phrases and comments of concern, to identify possible distress.
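To illustrate the general idea of pattern-recognition signals, here is a deliberately simplified sketch. It is not Meta's system; the phrases, weights, and the `distress_score` function are all hypothetical, and a real classifier would be a trained model informed by experts rather than a hand-written list.

```python
# Illustrative toy only: NOT Meta's production system.
# Hypothetical phrase signals with made-up weights; a real system
# would use a trained machine learning model, not a keyword list.
import re

DISTRESS_PATTERNS = {
    r"\bcan't go on\b": 3,      # first-person expression of distress
    r"\bsaying goodbye\b": 3,
    r"\bare you ok\b": 2,       # concerned comments can be signals too
}

def distress_score(text: str) -> int:
    """Sum the weights of matched patterns as a crude risk signal."""
    text = text.lower()
    return sum(w for pat, w in DISTRESS_PATTERNS.items()
               if re.search(pat, text))
```

A higher score would simply mean the content merits faster human review; no automated action would follow from the score alone.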

We use artificial intelligence to prioritize the order our team reviews reported posts, videos and livestreams. This ensures we can efficiently enforce our policies and get resources to people quickly. It also lets our reviewers prioritize and evaluate urgent posts, contacting emergency services when members of our community might be at risk of harm. Speed is critical.

By using technology to prioritize and streamline reports, we can escalate the content to our Community Operations team, who can quickly decide whether there are policy violations and whether to recommend contacting local emergency responders. We will continue to invest in technology to better serve our community.
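The triage step described above can be sketched with a standard priority queue. This is a generic illustration of urgency-ordered review, not Meta's implementation; the `Report` class, the priority values, and the post IDs are all invented for the example.

```python
# Illustrative sketch of urgency-ordered report triage (hypothetical).
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    priority: int                       # lower value = more urgent
    post_id: str = field(compare=False) # excluded from ordering

queue: list[Report] = []
heapq.heappush(queue, Report(priority=2, post_id="p-101"))
heapq.heappush(queue, Report(priority=0, post_id="p-202"))  # imminent risk
heapq.heappush(queue, Report(priority=1, post_id="p-303"))

# Reviewers always pull the most urgent report first.
most_urgent = heapq.heappop(queue)
```

Ordering reports this way means the most urgent case surfaces first regardless of when it was filed, which matches the goal of getting help to people quickly.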

Reporting posts

Meta users can report posts they feel indicate someone is thinking about suicide. Trained members of our Community Operations team review these reports and may connect the poster with support resources. When we think there is an imminent risk of harm, we alert first responders or emergency services so they can conduct a wellness check, an action that has saved lives.

Rapid technology

Meta's technology for identifying possible suicide and self-injury content is integrated into posts and live video on both Facebook and Instagram. If it appears someone in a live video is considering self-harm, people watching can reach out directly to the person and report the video to us.

Whether a post is reported by a concerned friend or family member or identified by machine learning, the next step is the same: review. A member of the Community Operations team reviews the report for imminent risk of self-harm or policy violations. In serious cases, we work with emergency services to conduct a wellness check. Thanks to Meta technology, we've helped first responders quickly reach people in distress.


“When someone expresses suicidal distress, it provides family, friends and even Facebook and Instagram the opportunity to intervene. If people can’t share their pain, or it is shared and then removed, we’ve missed a chance to save someone’s life. We train people to listen for this in conversations and to allow people to keep talking because we know that it is one way to help them through a crisis. Social media platforms allow us to do this in a way that brings many people to help very quickly.”

-Daniel J. Reidenberg, Psy.D., FAPA, Executive Director,

“Mental illness and thoughts about suicide are just not something we talk about openly. Yet talking and connecting is crucial to helping prevent depression and suicide. The tools Facebook is rolling out aim both at people who are expressing suicidal thoughts and at guiding concerned friends or family members to resources, alternatives, and appropriate interventions.”

-Anna Chandy, Chairperson - Trustees, The Live Love Laugh Foundation - India

“The Finnish Association for Mental Health is pleased to work with Facebook to provide support and resources for people who are feeling vulnerable and at risk of suicide and their close ones. To have this support available is important because when in crisis, people often don't have strength or courage to seek help and can even find it hard to realize that they need help. To have resources and contact information for experts made available in the language people speak is crucial.”

-Satu Raappana-Jokinen, Manager of Online Crisis Services, The Finnish Association for Mental Health

“Any initiative that helps raise awareness and prevent suicide is essential to reduce the tragic statistics showing that one Brazilian dies every 45 minutes. Some people can identify signals that a friend is thinking about suicide but do not know how to help. In these cases, tools like the one launched by Facebook in Brazil can directly and indirectly help this cause that has been embraced by CVV for more than 50 years and serve as a model for other organizations to overcome taboos and embrace further actions in this subject.”

-Carlos Correia, Volunteer, Centro de Valorização da Vida - Brazil