At Meta, we take a comprehensive approach to making our technologies a better place for everyone, especially teens.

Approach to Safety

We take a four-pronged approach to keeping people safe on our technologies:

Policies: We’re committed to making Meta technologies safe. We remove content that could contribute to a risk of harm to the physical security of persons. Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on our technologies. The Community Standards provide a foundation for the voices using Facebook, Instagram, Threads and Messenger each day. They define what is and isn’t okay to share. They are not meant to set a tone of restriction and silence, but rather to create a safe environment for expression. Please read more in our Meta Community Standards.

Detection and enforcement: We have around 40,000 people overall working on safety and security, and we have invested over US$20 billion since 2016.

User tools and controls: Meta has developed more than 50 tools and features to help support teens and their parents in having a safe online experience on our apps.

Education and partnerships: Our youth wellbeing program in Singapore in partnership with EYEYAH! has engaged over 15,000 teens, parents and educators. We partner with SG Her Empowerment (SHE) to help victims get support for online harms.

Approach to Youth Safety & Wellbeing

We require that people who use our apps are aged 13 years and above. We want teens to have safe, age-appropriate experiences on our apps.

Teen Accounts: We are introducing Instagram teen accounts in Singapore this year, with built-in protections applied automatically, to reassure parents that teens are having safe experiences. Teen accounts will limit who can contact teens and the content they see, and help ensure that teens' time is well spent. Teens under 16 will need a parent's permission to make any of the built-in protections less strict within teen accounts. For more information, please see here.

Innovation in Age Verification: In addition to requesting someone to upload their ID for age verification, Meta is partnering with Yoti, a company that offers privacy-preserving ways to verify age. You can choose to upload a video selfie to verify your age. If you choose this option, you'll see instructions on your screen to guide you. After you take a video selfie, we share the image with Yoti, and nothing else. Yoti's technology estimates your age based on your facial features and shares that estimate with us. Meta and Yoti then delete the image. The technology cannot recognize your identity – just your age. Meta also uses AI to understand if someone is a teen or an adult. For example, AI helps us prevent teens from accessing Facebook Dating, prevent adults from messaging teens, and prevent teens from receiving restricted ad content. Our goal is to expand the use of this technology more widely across our technologies. For more information, please see here.

Parental Approval at the OS/Appstore Level: When a teen wants to download an app, the operating system (OS) or app store should be required to notify their parents so they can decide if they want to approve it, much like when parents are notified if their teen attempts to make a purchase. Placing the point of approval within the OS or app store simplifies the process and leverages optional approval systems already offered by app stores. With this solution, parents can also easily verify the age of their teen, which helps apps to place teens in the right experience for their age. Furthermore, because in some countries parents already provide official identifications like Government IDs to app stores when they purchase a teen’s phone and set up their account, it avoids this sensitive information having to be shared with multiple apps to achieve the same outcome.

Transparency

We recognise our responsibility at Meta to protect the safety of people who use our services. It is inherent and essential to our business: Singaporeans and other people and businesses globally will only continue to use our technologies if they have positive, meaningful and safe experiences.

Our Transparency Center contains the policies we have in place to keep people safe, such as the Community Standards. We also publish the Community Standards Enforcement Report on a quarterly basis, so people can see how well we are enforcing our policies and track our progress in making Facebook and Instagram safe and inclusive.

We also recognise the importance of collaboration between the technology industry, governments and civil society to combat online harms. As such, we support the Singapore Online Safety Code of Practice as a positive step in facilitating this collaboration by providing greater transparency about the efforts technology companies have made towards online safety. We believe that transparency of such information will help inform public discourse and policy development.

Meta's Online Safety Code of Practice Annual Reports

Programs

Our programs and partnerships aim to put expert-backed resources and tools into the hands of parents, teens and educators, to support the work of local experts and community organisations on online safety in Singapore, and to gather important signals from them on how we can do better.

EYEYAH! x Meta Youth Digital Wellness Program: A peer-to-peer ‘ambassador program’ launched by Meta and EYEYAH! in 2023 to help teens in secondary schools take control of their social media experience. Using eye-catching illustrations and animations to spark conversation, the learning journey includes educating teens on tools they can use to facilitate a positive social media experience. The programme was rolled out across 10 schools in Singapore in 2024, reaching 15,000 students and providing access to online resources and toolkits to gain knowledge and understanding of online harms, as well as preventative and safety tools.

Digital for Life Festival (annual): In 2023, Meta supported the national Digital for Life movement and participated in the annual Festival, featuring our youth digital wellness program with EYEYAH! and reaching up to 100,000 attendees with free toolkits from Meta for parents and teachers on how to facilitate conversations with teenagers about online safety.

Quarterly Parents’ Webinars: Meta supports the Infocomm Media Development Authority (IMDA)’s 2024 priorities for Digital Parenting Programming by hosting quarterly webinars for parents on “Navigating the digital world with your teen”. The webinar gives parents and guardians an introduction to how they can guide their teens in being safe online, with a focus on Instagram. This includes an overview and interactive session on policies, tools and resources that apply to teen online safety, and an open discussion session in which parents can put questions to Meta’s experts.

Bi-annual Learning Journey with MOE Educators: Meta supports the Ministry of Education with bi-annual learning journeys to Meta. We run a deep-dive session that gives educators an overview of our policies, tools and resources for teens and online safety, followed by open questions and discussion so we can hear feedback and learn about emerging issues educators are seeing in their classrooms.

APAC Youth Safety Summit: From August 28–30, 2023, Meta consulted a regional group of youth wellbeing and safety experts, including members of civil society organisations in Singapore and international organisations, in a dialogue on Meta's products, policies and resources for parents and youth.

Partnerships

EYEYAH!
SG Her Empowerment (SHE)
TOUCH Community Services