PRIVACY PROGRESS UPDATE

Since 2019, we've overhauled privacy at Meta, investing USD 5.5 billion in a rigorous privacy programme that includes people, processes and technology designed to identify and address privacy risks early and embed privacy into our products from the start.

01. HOW WE DO IT

We've grown our product, engineering and operations teams focused primarily on privacy across the company from a few hundred people at the end of 2019 to more than 3,000 people at the end of 2023.


"We realised we needed an order of magnitude greater investment in privacy – the summer of 2019 was that moment. The degree of change for Meta has been massive – it's required foundational changes to our people, processes and technical infrastructure, and oversight and accountability. And we continue to invest in protecting people's data as systems, technology and expectations evolve."


—Michel Protti, Chief Privacy Officer for Product

Governance

Our work on privacy is underpinned by our internal governance structures that embed privacy and data-use standards across the company's operations.

As we continue to integrate privacy across the company, we've embedded privacy teams within product groups that deepen the understanding of privacy considerations by providing expertise within each product group. These teams enable front-line ownership of privacy responsibilities across our products.

Led by Chief Privacy Officer, Product, Michel Protti, the Privacy and Data Practices team is made up of dozens of teams, both technical and non-technical, focused on setting and maintaining privacy strategies and enabling the rest of the company to adhere to them.

The Privacy and Data Practices Team is at the centre of our company's efforts to maintain a comprehensive privacy programme. Its mission – to instil responsible data practices across Meta – guides this work by ensuring that people understand how Meta uses their data and trust that our products use their data responsibly.

The Privacy and Data Practices Team is just one organisation among many across the company that is responsible for privacy. There are thousands of people in different organisations and roles across Meta, including public policy and legal, who are working to embed privacy into all facets of our company operations. Getting privacy right is a deeply cross-functional effort, and we believe that everyone at Meta is responsible for that effort.

Led by Erin Egan, Vice President and Chief Privacy Officer, Policy, the Privacy and Data Policy team leads our engagement in the global public discussion around privacy, including new regulatory frameworks, and ensures that feedback from governments and experts around the world is considered in our product design and data use practices.

To do so, the Privacy and Data Policy team consults with these groups through a variety of consultation mechanisms, including:

  • Privacy advisory groups composed of in-region experts or specialists on topics such as ads or Extended Reality ("XR")
  • Workshops (including Data dialogues) and other convenings on complex privacy questions
  • Ongoing consultations with experts as part of our product development process, and to inform our long-term approach to privacy
  • Funding experts' research and advocacy around privacy, including our privacy research and XR responsibility grants
  • Funding experts to do "deep dive" consultations with us on how we can improve aspects of our privacy programme
  • Supporting cross-industry organisations such as Trust, Transparency and Control Labs (TTC Labs) and Open Loop that develop innovative solutions to help us and others build technology responsibly

We also host a regular conversation series and an annual privacy expert flyout with leading privacy experts from around the world to discuss a range of pressing privacy policy topics.

The Privacy Legal team is embedded in the design and ongoing execution of our programme and advises on legal requirements during the course of our privacy review process.

The Privacy Committee is an independent committee of our Board of Directors that meets at least quarterly to ensure that we're meeting our privacy commitments. The Committee is composed of independent directors with a wealth of experience serving in similar oversight roles. At least once per quarter, they receive briefings on, among other things, the global policy landscape, the state of our privacy programme and the status of the independent third-party assessment of our privacy programme.

Internal audit brings independent assurance on the overall health of our privacy programme and the supporting control framework.

Privacy education

Part of ensuring that everyone understands their role in protecting privacy at Meta is driving continuous privacy learning and education that spans training and internal privacy awareness campaigns.

A core component of our privacy education approach is our privacy training, which covers the foundational elements of privacy and is designed to help everyone at Meta recognise and consider privacy risks. Delivered in an e-learning format, both our annual privacy training and our courses for new hires and new contingent workers provide scenario-based examples of privacy considerations aligned with our business operations, along with an assessment to test understanding of the relevant privacy concepts. These trainings are updated and redeployed annually so that they cover current information alongside core concepts.

Alongside our foundational required privacy training, we also maintain a catalogue of all known privacy training deployed across Meta that spans topics relevant to people in specific roles.

Another way in which we drive privacy education is through regular communication to employees. In addition to our privacy training courses, we deliver ongoing privacy content through internal communication channels, updates from privacy leadership, internal Q&A sessions and a dedicated Privacy Week.

During our dedicated Privacy Week, we drive cross-company focus on privacy, feature internal and external speakers, and highlight key privacy concepts and priorities through engaging content and events.

When we participate in external privacy events such as Data Privacy Day or our annual privacy expert flyout, we drive internal awareness and engagement through internal channels to ensure that everyone has an opportunity to participate and learn about privacy.

Regulatory readiness process

We have a dedicated team whose job is to help ensure that we comply with global privacy and data regulations. Our regulatory readiness process is structured according to key privacy topics or "privacy areas" (e.g. youth, sensitive data, consent etc.) to help ensure that we're addressing privacy requirements holistically.

Privacy risk identification and assessment

We've created our Privacy Risk Management programme to identify and assess privacy risks related to how we collect, use, share and store user data. We leverage this process to identify risk themes, enhance our privacy programme and prepare for future compliance initiatives.

Safeguards and controls

We've designed safeguards, including processes and technical controls, to address privacy risks. As a part of this effort, we conduct internal evaluations on both the design and effectiveness of the safeguards for mitigating privacy risk.

Issues management

We've established a centralised issue management function to facilitate self-identification and remediation of privacy issues. This process spans the privacy issue management lifecycle, from intake and triage through remediation planning to closure with evidence.

Privacy red team

We've established a privacy red team whose role is to proactively test our processes and technology to identify potential privacy risks. The Privacy Red Team assumes the role of external or internal parties attempting to circumvent our privacy controls and safeguards, which helps us proactively identify areas where we can improve our control environment.

Incident management

No matter how robust our mitigations and safeguards, we also need a process to (1) identify when an event potentially undermines the confidentiality, integrity or availability of data for which Meta is responsible, (2) investigate those situations and (3) take any needed steps to address gaps we identify.

Our incident management programme operates globally to oversee the processes by which we identify, assess, mitigate and remediate privacy incidents. Although the Privacy and Data Practices team leads the incident management process, privacy incidents are everyone's responsibility at Meta. Teams from across the company, including legal and product teams, play vital roles. We continue to invest time, resources and energy in building a multi-layered programme that is constantly evolving and improving, and we highlight three components of our approach below.

We take a layered approach to protecting people and their information – which includes implementing safeguards designed to catch bugs proactively, before they can become a problem. Given the scale at which we operate, we have invested heavily in building and deploying a wide range of automated tools that are intended to help us identify and remediate potential privacy incidents as early and quickly as possible. These automated systems are designed to detect incidents in real time to facilitate rapid response.
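As an illustration only (the service names and purpose registry below are hypothetical, not Meta's actual systems), one common shape for such an automated check is a rule that flags data-access events not covered by a service's declared purposes:

```python
# Hypothetical registry mapping each service to the data types it has
# a declared purpose to access.
ALLOWED = {
    "messaging_service": {"messages"},
    "ads_ranking": {"ad_interactions"},
}

def flag_incidents(events):
    """Return access events whose service has no declared purpose
    for the data type touched -- candidates for incident triage."""
    return [e for e in events if e["data_type"] not in ALLOWED.get(e["service"], set())]

events = [
    {"service": "messaging_service", "data_type": "messages"},   # expected
    {"service": "ads_ranking", "data_type": "messages"},         # unexpected access
]
print(flag_incidents(events))  # → [{'service': 'ads_ranking', 'data_type': 'messages'}]
```

In a production setting a rule like this would run against streaming access logs rather than an in-memory list, which is what makes real-time detection possible.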

Of course, no matter how capable our automated systems become, the oversight and diligence of our employees always plays a critical role in helping to proactively identify and remediate incidents. Our engineering teams regularly review our systems to identify and fix incidents before they can impact people.

Since 2011, we have operated a bug bounty programme in which external researchers help improve the security and privacy of our products and systems by reporting potential security vulnerabilities to us. The programme helps us scale detection efforts and fix issues faster to better protect our community, and the bounties we pay to qualifying participants encourage more high-quality security research.

Over the past 10 years, more than 50,000 researchers have joined this programme and around 1,500 researchers from 107 countries have been awarded bounties.

While we've adopted a number of protections to guard against privacy incidents such as unauthorised access to data, if an incident does occur, we believe that transparency is an important way to rebuild trust in our products, services and processes. Accordingly, beyond fixing and learning from our mistakes, our incident management programme includes steps to notify people where appropriate, such as a post in our Newsroom or our Privacy Matters blog about issues affecting our community, or working with law enforcement or other officials to address incidents we find.

Third-party oversight

Third parties are external partners who do business with Meta but aren't owned or operated by Meta. These third parties typically fall into two major categories: those who provide a service for Meta (such as vendors who provide website design support) and those who build their businesses around our platform (such as app or API developers). To mitigate privacy risks posed by third parties that receive access to personal information, we developed a dedicated third-party oversight and management programme, which is responsible for overseeing third-party risks and implementing appropriate privacy safeguards.

We've also created a third-party privacy assessment process for service providers to assess and mitigate privacy risk. Our process requires that these service providers are also bound by contracts containing privacy protections. Their risk profile determines how they are monitored, reassessed and, where appropriate, which enforcement actions to take as a result of violations, including termination of the engagement.

We've designed a formal process for enforcing and offboarding third parties who violate their privacy or security obligations. This includes standards and technical mechanisms that support better developer practices across our platform, including:

  • Data Use Checkup (DUC): Procedures and infrastructure designed to ensure that third-party developers complete an annual Data Use Checkup (DUC), in which developers certify to the purpose for and use of each type of personal information that they request or continue to have access to, and that each purpose and use complies with applicable terms and policies. We've introduced new questions and improved logic to ensure greater accuracy in responses and better comprehension from developers. We've also created new tooling to centralise developer communications and requests for additional information into a single location.
  • Monitoring developer compliance: We've developed technical and administrative mechanisms to monitor developers' compliance with our Platform Terms on both an ongoing and periodic basis. When we detect a violation, we take standardised enforcement actions, which, among other factors, take into account the severity, nature and impact of the violation, the developer's malicious conduct or history of violations, and applicable law when determining the appropriate enforcement action to take.
  • Data security standards: We've also developed data security principles based on industry standards for developers to drive better security practices across our platform and the developer ecosystem more broadly.
  • Developer Trust Centre: We launched the Developer Trust Centre, a central hub on the Meta for Developers site that brings together material for third-party developers on data privacy, data security, Platform Terms and monitoring mechanisms that they interact with, such as App Review, App Re-Review, DUC and the Data Protection Assessment (DPA).
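The annual certification requirement in Data Use Checkup can be sketched as a simple freshness check. This is a hypothetical illustration (the app names, dates and cutoff policy are invented, not Meta's actual enforcement logic):

```python
from datetime import date, timedelta

# Hypothetical record of each app's most recent Data Use Checkup certification.
certifications = {
    "app_a": date(2023, 11, 1),
    "app_b": date(2022, 6, 1),
}

def apps_to_restrict(certs, today, max_age_days=365):
    """Flag apps whose annual certification has lapsed; in a real
    system these would lose data access until they re-certify."""
    cutoff = today - timedelta(days=max_age_days)
    return sorted(app for app, certified in certs.items() if certified < cutoff)

print(apps_to_restrict(certifications, date(2024, 1, 1)))  # → ['app_b']
```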

External data misuse

Our External data misuse team is dedicated to detecting, investigating and blocking patterns of behaviour associated with scraping. Scraping is the automated collection of data from a website or app and can be either authorised or unauthorised. Using automation to access or collect data from Meta's platforms without our permission is a violation of our terms of service.

We continue to invest in infrastructure and tools to make it harder for scrapers to collect data from our services and more difficult to capitalise off of it if they do. Examples of these investments include rate limits and data limits. Rate limits cap the number of times anyone can interact with our products in a given amount of time, while data limits keep people from getting more data than they should need to use our products normally.
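A rate limit of the kind described above is often implemented as a token bucket: each identity gets a budget of actions that refills over time. This is a minimal sketch under assumed parameters, not Meta's actual implementation:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: allows bursts of up to
    `capacity` actions, refilled at `rate` tokens per second."""

    def __init__(self, capacity, rate, clock=time.monotonic):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.clock = clock
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# With a frozen clock, a burst of six requests against a capacity of
# five lets the first five through and throttles the sixth.
bucket = TokenBucket(capacity=5, rate=1, clock=lambda: 0.0)
results = [bucket.allow() for _ in range(6)]
print(results)  # → [True, True, True, True, True, False]
```

A data limit works analogously, but budgets the volume of records returned rather than the number of requests.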

We rely on internally generated user and content identifiers, having observed that unauthorised scraping often involves guessing or purchasing such identifiers. We have also introduced new pseudonymised identifiers that help deter unauthorised data scraping by making it harder for scrapers to guess, connect and repeatedly access data.
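One standard way to build pseudonymised identifiers is to derive them from internal IDs with a keyed hash, so that sequential internal IDs map to unrelated-looking public ones. This is a generic sketch of the technique with an invented key, not a description of Meta's scheme:

```python
import hmac
import hashlib

SECRET_KEY = b"server-side-secret"  # hypothetical; held only server-side

def pseudonymise(internal_id: int) -> str:
    """Derive a stable, non-sequential public identifier from an
    internal one. Without the key, knowing one public identifier
    reveals nothing about its neighbours."""
    digest = hmac.new(SECRET_KEY, str(internal_id).encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# Adjacent internal IDs yield unrelated public IDs, so enumerating
# identifiers no longer enumerates users or content.
print(pseudonymise(1000))
print(pseudonymise(1001))
```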

We've blocked billions of suspected unauthorised scraping actions per day across Facebook and Instagram, and we've taken a variety of actions against unauthorised scrapers, including disabling accounts and requesting that companies hosting scraped data delete it.

Privacy review

The privacy review process is a central part of developing new and updated products, services and practices at Meta. Through this process, we assess how data will be used and protected as part of new or updated products, services and practices. We review an average of 1,200 products, features and data practices per month across the company before they ship to assess and mitigate privacy risks.

As part of the process, a cross-functional team of privacy experts evaluates potential privacy risks associated with a project and determines if there are any changes that need to happen before project launch to mitigate those risks. If there is a disagreement on the assessment of applicable risks or the proposed product mitigations, the process requires teams to escalate to product and policy leadership and ultimately the CEO for further evaluation and decision.

The development of our new or modified products, services or practices through the privacy review process is guided by our internal privacy expectations, which include:

  1. Purpose limitation: Only process data for a limited, clearly stated purpose that provides value to people.
  2. Data minimisation: Collect and create the minimum amount of data required to support clearly stated purposes.
  3. Data retention: Only keep data for as long as it is actually required to support clearly stated purposes.
  4. External data misuse: Protect data from abuse, accidental loss and access by unauthorised third parties.
  5. Transparency and control: Communicate product behaviour and data practices proactively, clearly and honestly. Whenever possible and appropriate, give people control over our practices.
  6. Data access and management: Provide people with the ability to access and manage the data that we have collected or created about them.
  7. Fairness: Build products that identify and mitigate risk for vulnerable populations, and ensure that value is created for people.
  8. Accountability: Maintain internal process and technical controls across our decisions, products and practices.
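To make one of these expectations concrete, data retention (expectation 3) is often enforced mechanically: every record carries a declared purpose, and records past that purpose's retention window are purged. The purposes and windows below are hypothetical, purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical retention windows per declared purpose.
RETENTION = {
    "fraud_detection": timedelta(days=90),
    "analytics": timedelta(days=30),
}

def purge_expired(records, now):
    """Keep only records still inside the retention window for
    their declared purpose; everything else is dropped."""
    return [r for r in records if now - r["created"] <= RETENTION[r["purpose"]]]

now = datetime(2024, 1, 1)
records = [
    {"id": 1, "purpose": "analytics", "created": now - timedelta(days=10)},
    {"id": 2, "purpose": "analytics", "created": now - timedelta(days=45)},
    {"id": 3, "purpose": "fraud_detection", "created": now - timedelta(days=45)},
]
kept = purge_expired(records, now)
print([r["id"] for r in kept])  # → [1, 3]
```

Tying every record to a stated purpose is also what makes purpose limitation (expectation 1) auditable rather than aspirational.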

We've also invested in verification reviews and a centralised platform to support operating the Privacy Review process at scale:

  • Centralised platform: We have invested in a centralised platform that is used throughout the privacy review process and enables teams to manage all aspects of their reviews, requirements and privacy decisions. It also serves as a central repository for searching and managing the external privacy commitments that we've made as a company.
  • Verification reviews: Privacy review includes a technical review phase designed to analyse, verify and document the technical implementation of the privacy requirements, mitigations and applicable commitments for each launch. This process, integrated with the tools we use to build software at Meta, enables verification that what was agreed in Privacy Review was actually implemented.
02. PRIVACY PRODUCT OUTCOMES

We continuously invest in product innovations that deliver privacy and controls that benefit our users.


"Billions of people trust us with their privacy everyday. Privacy Review is key to honouring that trust and helps ensure that we innovate responsibly. Our primary goal is to show our users and regulators that we're meeting our privacy obligations and getting this right."


—Komal Lahiri, VP Privacy Review

We put protecting users' privacy at the heart of how we build and continuously update our products. We do that by building default settings and controls to make it easy for users to set the level of privacy they are most comfortable with. We also do it by putting privacy at the centre of how we develop new products.

Since 2016, Messenger has had the option for people to turn on end-to-end encryption. In 2023, we began to roll out default end-to-end encryption for all personal chats and calls on Messenger and Facebook.

This has taken years to deliver because we've taken the time to get it right. Our engineers, cryptographers, designers, policy experts and product managers have worked tirelessly to rebuild Messenger features from the ground up. Enabling end-to-end encryption on Messenger meant fundamentally rebuilding many aspects of the application protocols to improve privacy, security and safety while simultaneously maintaining the features that have made Messenger so popular. Our approach was to leverage prior learnings from both WhatsApp and Messenger's secret conversations, and then iterate on our most challenging problems such as multi-device capability, feature support, message history and web support. Along the way, we introduced new privacy, safety and control features such as app lock and delivery controls that let people choose who can message them, and improved existing safety features such as report, block and message requests.

At its core, end-to-end encryption is about protecting people's communications, so they can feel safe expressing themselves with their friends and loved ones. We worked closely with outside experts, academics, advocates and governments to identify risks and build mitigations to ensure that privacy and safety go hand in hand. We commissioned an independent human rights impact assessment and published an extensive white paper on Meta's approach to safer private messaging on Messenger and Instagram DMs.

Ray-Ban Meta smart glasses let you snap a photo or video clip from your unique point of view, listen to music or take a call, and use smart features as they're rolled out – all without having to pull out your phone. Ray-Ban Meta smart glasses have been redesigned with a higher quality camera, improved audio and microphone systems, and new features, such as live-streaming and built-in Meta AI, so you don't have to choose between capturing the moment and experiencing it.

Ray-Ban Meta smart glasses were built with privacy at their core, and serve as a clear proof point of our commitment to responsible innovation and privacy by design. We've incorporated stakeholder feedback – which we have gathered since the moment we launched Ray-Ban Stories – in meaningful and tangible ways.

  • The capture LED is now more noticeable and visible with a differentiated signalling pattern – solid to blinking – for longer-duration capture (video recording, live-streaming).
  • We also introduced a tamper detection feature to prevent users from recording while the capture LED is fully covered. If the capture LED is fully obscured, the user will not be able to use the camera and will be notified to remove the obstruction before proceeding.
  • The Meta View companion app continues to provide easy access to privacy settings to manage information and additional data sharing with Meta.

We launched new generative AI features, including AI stickers, image editing with AI, our AI assistant known as Meta AI spanning across our apps, and 28 new AI characters played by cultural icons and influencers. As part of the launch of these features, we included a Generative AI Privacy Guide and other transparency resources for people to understand how we built our AI models, how our AI features work, and what controls and data privacy rights they have.

Last year, we updated our "Why am I seeing this?" tool, which aims to help people understand why they're seeing the ads they do on Facebook and Instagram feeds. One key update was summarising information into topics about how activity both on and off our technologies – such as liking a post on a friend's Facebook Page or visiting a sports website – may inform the machine learning models we use to shape and deliver the ads seen. We also introduced new examples and illustrations explaining how our machine learning models connect various topics to show relevant ads. Additionally, we introduced more ways for users to find our ad controls, making ad preferences accessible from additional pages in the "Why am I seeing this ad?" tool.

We developed the Meta Content Library (MCL) and API, new research tools that provide qualified researchers with access to additional publicly available content across Facebook and Instagram, in privacy-protective ways.

These tools provide researchers with access to near real-time public data, including content from Pages, Groups and Events on Facebook, as well as from creator and business accounts on Instagram. Details about the content, such as the number of reactions, shares, comments and, for the first time, post view counts, are also available. Researchers can search, explore and filter that content through both a graphical user interface (UI) and a programmatic API.

We partnered with the Social Media Archive (SOMAR) initiative at the University of Michigan's Inter-university Consortium for Political and Social Research (ICPSR) to share public data from our platforms. Access to restricted data is tightly controlled, and authorised users must agree to strict data use and confidentiality terms to gain access.

Our work to communicate transparently includes providing external education to improve people's understanding and awareness of our practices and ensuring that information is accessible and easy to find.

  • Privacy Policy, which details how we collect, use, share, retain and transfer information, as well as what rights and controls people have over their privacy.
  • Privacy Centre, where people can go to better understand our practices so they can make informed decisions about their privacy in a way that is right for them. Through education and access to privacy and security controls, we address some of the most common privacy concerns from the billions of people who spend their time with us every day. Privacy Centre has several modules, including sharing, collection, use, security, youth, generative AI and ads, to directly connect an issue or concern with the relevant privacy and security controls we've built across our apps and services over the years.
  • Data and Privacy section of Newsroom, where we provide more information about how we've approached privacy in the context of particular features or issues.

To provide greater transparency and control to people, we've developed a number of privacy tools for people to understand what they share and how their information is used, including:

  • Privacy Checkup: Guides people through important privacy and security settings on Facebook to help strengthen account security and manage who can see what they share and how their information is used. Privacy Checkup has five distinct topics to help people control who can see what they share, how their information is used and how to strengthen their account security.
    • Who can see what you share helps people review who can see their profile information, such as their phone number and email address, as well as their posts.
    • How to keep your account secure helps people strengthen their account security by setting a stronger password and turning on two-factor authentication.
    • How people can find you on Facebook lets people review the ways in which others can look them up on Facebook and who can send them friend requests.
    • Your data settings on Facebook lets people review the information they share with apps they've logged in to with Facebook. They can also remove the apps they no longer use.
    • Your ad preferences on Facebook provides information about how ads work on our Products, lets people decide what profile info advertisers can use to show them ads, and lets them control who can see their social interactions, such as likes, alongside ads.
  • Activity off Meta technologies: Provides a summary of activity that businesses and organisations share with us about people's interactions, such as visiting their apps or websites, and gives people the option to disconnect their past activity from their account.
  • Manage activity: Allows people to manage posts in bulk, with filters to help them sort and find the content they are looking for, such as posts with specific people or from a specific date range. It also includes a deletion control that provides a more permanent option, so people can move old posts in bulk to the bin. After 30 days, posts sent to the bin will be deleted, unless they choose to manually delete or restore them before then.
03. Technical privacy investments

We're investing in technological innovations that deliver privacy benefits for our users.

"Investing in infrastructure helps ensure that privacy is inherent in everything we build. It enables us to continue building innovative, valuable products for people in a privacy-safe way."


—Michel Protti, Chief Privacy Officer for Product


We continue to build privacy-aware infrastructure – scalable and innovative infrastructure solutions that enable engineers to more easily address privacy requirements as they build products. Privacy-aware infrastructure allows us to increasingly use automation, rather than rely primarily on people and manual processes, to verify that we are meeting our privacy responsibilities.

We're proactively reducing the amount of user data that we collect and use by deploying innovative tools and technology across Meta. We continue to invest in privacy-enhancing technologies (PETs) – technologies based on advanced cryptographic and statistical techniques that help to minimise the data we collect, process and use – and have been working to open source this work where it is useful for the broader ecosystem, including on PETs for AI through PyTorch. Additionally, our investments in PETs helped enable a new cryptographic security feature on WhatsApp, based on key transparency, that helps to verify that your connection and conversation are secure. This feature reduces the possibility of a third party impersonating the person or business a user wants to connect and share encrypted messages with. It checks the validity of public keys in the conversation against a server-side directory that stores public keys alongside user information, and it provides a publicly available, privacy-preserving audit record so that anyone can verify that data has not been deleted or modified in the directory.
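The auditability property behind key transparency can be illustrated with a toy append-only hash chain over directory entries: each published head commits to all prior entries, so silently deleting or rewriting a key changes every later head. This is a greatly simplified sketch (real key-transparency systems use Merkle trees and signed tree heads), with invented user names and keys:

```python
import hashlib

def leaf_hash(user, key):
    return hashlib.sha256(f"{user}:{key}".encode()).hexdigest()

def chain(entries):
    """Build an append-only hash chain over (user, public_key)
    directory entries; each head commits to all prior entries."""
    head = ""
    heads = []
    for user, key in entries:
        head = hashlib.sha256((head + leaf_hash(user, key)).encode()).hexdigest()
        heads.append(head)
    return heads

directory = [("alice", "pk_a"), ("bob", "pk_b")]
heads = chain(directory)

# Substituting an earlier key changes the published head, which
# auditors comparing heads against the server would detect.
tampered = chain([("alice", "pk_evil"), ("bob", "pk_b")])
print(heads[-1] != tampered[-1])  # → True
```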

Similarly, we developed a framework for code and asset removal which guides engineers through deprecating a product safely and efficiently. Deprecating products is a complex feat involving internal and external dependencies, including dependencies on other Meta products that may not themselves be in scope for removal. To address this, our Systematic Code and Asset Removal Framework (SCARF) includes a workflow management tool that saves engineers time by identifying dependencies as well as the correct order of tasks for cleaning up a product. In addition, SCARF includes subsystems for safely removing dead code as well as unused data types.

SCARF powers thousands of human-led deprecation projects alongside the millions of code and data assets it has cleaned automatically. It is additionally useful for our privacy teams, who use the tool to monitor progress of ongoing product deprecations and ensure that they are completed in a timely manner.
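The "correct order of tasks" that a workflow tool like this computes is, at heart, a topological ordering of the dependency graph: an asset can only be removed once everything that still depends on it is gone. A minimal sketch with an invented dependency graph (these asset names are illustrative, not SCARF internals):

```python
from graphlib import TopologicalSorter

# Hypothetical graph for one deprecation: each asset maps to the
# assets that must be cleaned up before it can be removed.
deps = {
    "user_table": {"feed_service", "ads_pipeline"},
    "feed_service": {"feed_api"},
    "ads_pipeline": set(),
    "feed_api": set(),
}

# static_order() yields assets with all prerequisites first, so the
# leaf dependents go out before the shared table they rely on.
order = list(TopologicalSorter(deps).static_order())
print(order)  # user_table always comes last
```

`graphlib` also raises `CycleError` on circular dependencies, which in a deprecation setting is exactly the case that needs human untangling.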

04. ONGOING COMMITMENT TO PRIVACY

We're invested in privacy and are committed to continuous improvement.

"Getting privacy right is a continual, collective investment across our company, and is the responsibility of everyone at Meta to advance our mission." Michel Protti, Chief Privacy Officer for Product

Protecting users' data and privacy is essential to our business and our vision for the future. To do so, we're continually refining and improving our privacy programme and our products, as we respond to evolving expectations and technological developments – working with policy makers and data protection experts to find solutions to unprecedented challenges – and sharing our progress as we do.