
We have a responsibility to protect people’s privacy and give them control to make their own choices.

We believe in being transparent about how we are approaching privacy as a company.

Continuing to Invest

We’ve previously shared our commitment to changing our privacy approach and investing in efforts to ensure we protect people’s privacy. To demonstrate our dedication to building towards the future responsibly, we continue to make progress on our work to build a stronger privacy foundation at Meta. We do this by scaling processes and technical mechanisms that drive accountability and work to ensure our community trusts that our products use their data responsibly. This work builds on our existing efforts to comply with global privacy/data protection laws.

Holding Ourselves Accountable

Since committing to making these changes, one thing we’ve heard is that being accountable also means being transparent about our privacy efforts. We want to continue to provide you with a closer look at the work we’re doing to embed privacy across our company operations, the outcomes that this work is driving and the technical solutions we’re investing in to address privacy at scale. We hope this information will both help our community understand the work we’re doing to protect privacy and enable dialogue about the approach we’ll take from here.

Photo of Meta CEO Mark Zuckerberg addressing employees at an outdoor town hall

“Privacy at Meta involves all of us. We’re bringing teams across the company together to further strengthen our privacy program and continue developing innovative technical solutions to support our long-term efforts.”

- Michel Protti, Chief Privacy Officer for Product


Our privacy work is a journey that will never end. We’re committed to continually refining and improving our privacy program as we respond to evolving expectations and technological developments. We’ll continue to share our progress as we improve and evolve our program.


We’ve designed a governance framework to foster accountability for privacy at every level of our company.

Photo of Chief Privacy Officer of Product Michel Protti on Meta’s Menlo Park campus

“We’ve made important progress, but we still have a tremendous amount of work to do. We’re in the early phases of a multi-year and ongoing effort to evolve our culture, our operations and our technical systems to honor people’s privacy.”

- Michel Protti, Chief Privacy Officer for Product

We have designed our privacy program to scale and evolve over time. It includes a governance structure, privacy training and education that provide the foundation to enable privacy accountability across the company.


Privacy is everyone’s responsibility at Meta. From our CEO and executives to engineers and sales teams across the globe, everyone across the company is responsible for privacy. As a result, a cross-functional group of stakeholders across the company provides engineering, legal, policy, compliance, and product expertise to enable the design and implementation of our privacy program.

Led by Chief Privacy Officer, Product, Michel Protti, the Meta Privacy and Data Practices team is made up of dozens of teams, both technical and non-technical, focused on privacy and responsible data practices.

The Meta Privacy and Data Practices team is at the center of our company’s efforts to build a comprehensive privacy program. Its mission, to instill responsible data practices across Meta, guides this work by ensuring people understand how Meta uses their data and trust that our products use their data responsibly.

But the Meta Privacy and Data Practices team is just one organization among many across the company that is responsible for privacy. There are thousands of people in different organizations and roles across Meta, including public policy, privacy strategy, and legal, to name a few, who are working to embed privacy into all facets of our company operations. Getting privacy right is a deeply cross-functional effort, and we believe everyone at Meta is responsible for that effort.

To integrate privacy more deeply across the company, we're establishing embedded privacy teams within product groups that will deepen the understanding of privacy considerations by providing expertise within each product and business group. These teams will take on front-line ownership of compliance across our products.

Led by Erin Egan, Vice President & Chief Privacy Officer, Policy, the Privacy & Data Policy team leads our engagement in the global public discussion around privacy, including new regulatory frameworks. It ensures that feedback from governments and experts around the world is considered in our product design and data use practices, including during the course of our privacy review process. The team also consults with experts outside the company and leverages their insights in our privacy decisions through a variety of consultation mechanisms, including:

  • Privacy advisory groups composed of in-region experts or experts with specialized expertise on topics like ads or XR
  • Data dialogues and other convenings on complex privacy questions
  • Ongoing consultations with experts as a part of our product development process, and to inform decisions we make in privacy review
  • Funding experts' research and advocacy around privacy, including our privacy research and XR responsibility grants
  • Funding experts to do "deep dive" consultations with us on how we can improve aspects of our privacy program

We know privacy is a complex topic and we don’t have all the answers, so we will continue to work with experts inside and outside the company to develop best practices and contribute to the development of comprehensive privacy regulation.

The Privacy Legal team is embedded in the design and ongoing execution of our program and is tasked with providing counsel on what is legally required during the course of our privacy review process.

The Privacy Committee is an independent committee of Meta’s Board of Directors that meets quarterly to ensure we live up to our privacy commitments. The Committee is made up of independent directors with a wealth of experience serving in similar oversight roles.

The Committee receives regular briefings on the state of our privacy program and compliance with our FTC Order from our independent privacy assessor, whose job is to review and report on our privacy program on an ongoing basis.

Internal Audit brings independent assurance on the overall health of our privacy program and the supporting control framework.

Privacy Education

Our goal is to make privacy a core responsibility for every employee at Meta. Part of this is driving continuous privacy learning and education that spans training, internal awareness campaigns, and a regular speaker series.

A foundational component of our privacy education approach is our New Hire and Annual Privacy training, which covers the core elements of privacy, including Meta’s Privacy Program, and is designed to help every employee identify and mitigate privacy risks and understand how to work with the Privacy Program day-to-day. Through their eLearning format, both courses provide scenario-based examples of privacy considerations aligned with Meta’s business operations, and each includes an assessment to test employees’ understanding of the relevant privacy concepts.

In addition to our foundational mandatory training deployed to all personnel at Meta, we also maintain a course catalog of additional training that is relevant to people in specific roles at the company. The Privacy Training program will continue to invest in this portfolio as continuous learning opportunities across privacy and data topics are a critical component to instilling responsible data practices at Meta.

Another way we drive privacy education is through regular communication to employees. In addition to our Annual Privacy training and New Hire Privacy training courses, we deliver ongoing privacy content through internal Workplace channels, updates from privacy leadership, internal Q&A sessions, a dedicated Privacy Week, and an internal hub of on-demand privacy content to help guide decisions and processes.

And when we participate in external privacy events like Data Privacy Day, we drive internal awareness and engagement through internal channels to ensure everyone has an opportunity to participate.

Privacy Week drives cross-company focus on privacy, features internal and external speakers, and highlights key privacy concepts and priorities through engaging content and events that occur throughout the week.

Many of the privacy questions we confront don’t have easy or well-defined answers, and the best way to begin tackling those hard problems is by hearing from experts outside the company. In addition to work that we do to obtain input from experts during our privacy review process, we also host outside experts to speak to our entire team about their work and perspective on privacy at Meta. It’s an opportunity for our entire privacy team to hear from a variety of privacy experts on important and complex topics on a regular basis.


We’re scaling how we operationalize privacy, including how we build new products

We’ve made progress on our work to give people more control over their privacy, and our broader mission to honor people’s privacy in everything we do. We’ve done so by building processes, products and technical mechanisms that have laid the foundation for privacy and accountability across the company.

Photo of two people speaking at an internal roundtable

In order to put our accountability foundation into practice, we have designed processes, escalation paths and technical mechanisms that embed privacy across all facets of our company operations.

Risk assessments are essential to our ability to identify, assess, and mitigate privacy risks. We have designed a privacy risk assessment program that performs an annual assessment to identify and assess privacy risk across the company, as well as a process to assess privacy risk after certain incidents. We will continue to evolve and mature our privacy risk assessment process.

We have designed safeguards, including operational activities, policies, and technical systems, to address privacy risk and meet privacy expectations and regulatory obligations.

The Privacy Review process is a central part of developing new and updated products and services at Meta. Through this process, we assess how data will be used and protected as a part of new or updated products and services. We work to identify privacy risks that involve the collection, use or sharing of personal information and develop mitigations for those risks. The goal of this process is to maximize the benefits of our products and services for our community, while also working upfront to identify and reduce any risks.

The development of our new or modified products, services or practices is guided by our internal privacy expectations, which include:

  1. Purpose Limitation: Process data only for a limited, clearly stated purpose that provides value to people.
  2. Data Minimization: Collect and create the minimum amount of data required to support clearly stated purposes.
  3. Data Retention: Keep data for only as long as it is actually required to support clearly stated purposes.
  4. External Data Misuse: Protect data from abuse, accidental loss and access by unauthorized third parties.
  5. Transparency and Control: Communicate product behavior and data practices proactively, clearly and honestly. Whenever possible and appropriate, give people control over our practices.
  6. Data Access and Management: Provide people the ability to access and manage the data that we have collected or created about them.
  7. Fairness: Build products that identify and mitigate risk for vulnerable populations, and ensure value is created for people.
  8. Accountability: Maintain internal process and technical controls across our decisions, products and practices.

Privacy Review is a deeply collaborative, cross-functional process used to evaluate and comply with our obligations, and identify and mitigate privacy risks. It is led by our Privacy Review team, and is conducted by a dedicated group of internal privacy experts across legal, policy, and other cross functional teams with backgrounds in product, engineering, legal regulations, security and policy. This group is responsible for making Privacy Review decisions and recommendations.

As a part of the process, the cross-functional team evaluates privacy risks associated with the project and determines if there are any changes that need to happen before project launch to control for those risks. If there’s no agreement between the members of the cross-functional team on what needs to happen, the team escalates to a central leadership review, and further to the CEO, if needed for resolution.

We continue to incorporate technical requirements and utilize tools to enhance accountability and operate the Privacy Review process at scale.

We developed a centralized tool that is used throughout the Privacy Review lifecycle for a project. It enables teams to manage all aspects of their Privacy Review submissions, including to search and manage historical and new privacy commitments that we have made as a company.

We will continue leveraging these tools to support future investment in infrastructure improvements that will systematize the process of enforcing our privacy decisions. These changes will integrate more automated processes into Privacy Review, which will make it easier to consistently enforce our privacy commitments across our products and services.

In addition to developing a centralized tool for the process, we have designed a technical implementation review to analyze, verify, and document the technical implementation of privacy mitigations and commitments prior to product launch.

This process, integrated with the tools we use to build software at Meta, enables us to verify that what is agreed to in the documented decision is in fact what is implemented.

No matter how robust our mitigations and safeguards, we also need a process to identify when an event potentially undermines the confidentiality, integrity, or availability of data for which Meta is responsible, investigate those situations, and take any needed steps to address gaps we identify. Our Incident Management program operates globally to oversee the processes by which we identify, assess, mitigate, and remediate privacy incidents. Although the privacy team leads the incident management process, privacy incidents are everyone’s responsibility at Meta, with teams from across the company, including legal, policy, and product teams, playing vital roles. We continue to invest time, resources, and energy in building a multi-layered program that is constantly evolving and improving. Although each layer plays an important role, below we highlight three components that reflect our approach.

We take a layered approach to protecting people and their information, implementing many safeguards to catch bugs. Given the scale at which Meta operates, we have invested heavily in building and deploying a wide range of automated tools that are intended to help us identify and remediate potential privacy issues as early and quickly as possible. Issues detected through these automated systems are flagged in real time to facilitate rapid response, and in some cases, can be self-remediated.

Of course, no matter how capable our automated systems become, the oversight and diligence of our employees always plays a critical role in helping to proactively identify and remediate incidents. Our engineering teams are constantly reviewing our systems to identify and fix issues before they can impact people.

Since 2011, we have operated a bug bounty program in which external researchers help improve the security and privacy of our products and systems by reporting potential security vulnerabilities to us. The program helps us scale detection efforts and fix issues faster to better protect our community, and the rewards we pay to qualifying participants encourage more high-quality security research.

Over the past 10 years, more than 50,000 researchers joined this program and around 1,500 researchers from 107 countries have been awarded bounties. A number of them have since joined Meta’s security and engineering teams and continue this work protecting the Meta community.

While we’ve adopted a number of protections to guard against privacy incidents like unauthorized access to data, if an issue does occur, we believe that transparency is an important way to rebuild trust in our products, services, and processes. Accordingly, beyond fixing and learning from our mistakes, our Incident Management program includes steps to notify people where appropriate, such as a post in our Newsroom or our Privacy Matters blog about issues impacting our community, or working with law enforcement or other officials to address issues we find.

Third parties are external partners who do business with Meta but aren’t owned or operated by Meta. These third parties typically fall into two major categories: those who provide a service for Meta (like vendors who provide creative support) and those who build their businesses around our platform (like app or API developers). To mitigate privacy risks posed by third parties that receive access to personal information, we developed a dedicated third party oversight and management program, which is responsible for overseeing third party risks and implementing appropriate privacy safeguards.

We have developed a third party privacy assessment process for service providers to assess and mitigate privacy risk at Meta. These service providers are also bound by contracts containing privacy protections. Their risk profile may determine how they are monitored and reassessed and, where appropriate, which enforcement actions to take as a result of violations, including termination of the engagement.

We have designed a formal process for enforcing against and offboarding third parties who violate their privacy or security obligations.

To support this, we have developed procedures and infrastructure designed to ensure that third party developers complete an annual Data Use Checkup (DUC), in which developers certify to the purpose for and use of each type of personal information that they request or continue to have access to, and that each purpose and use complies with Meta’s Platform Terms and Developer Policies. We do not take these obligations for granted.

We have also developed technical and administrative mechanisms to monitor developers’ compliance with our Platform Terms on both an ongoing and periodic basis. When we detect a violation, we take into account the severity, nature and impact of the violation, the developer’s malicious conduct or history of violation, and applicable law when determining the appropriate enforcement action to take.

We have also developed data security standards based on principles for developers to drive better security practices across our platform and the developer ecosystem more broadly. As an example, we launched the Platform Initiatives Hub for developers to ensure that they have the tools and information they need to continue to use our platform responsibly.

We worked across Meta to launch the first version of the Developer Trust Center, a central hub on the Meta for Developers site bringing together material for third party developers on data privacy, data security, Platform Terms, and monitoring mechanisms that they interact with such as App Review, App Re-Review, DUC, and the new Data Protection Assessment (DPA).

Scraping is the automated collection of data from a website or app and can be either authorized or unauthorized. Using automation to access or collect data from Meta’s platforms without our permission is a violation of our terms.

Our External Data Misuse team consists of more than 100 people dedicated to detecting, investigating and blocking patterns of behavior associated with unauthorized access to data. Below, we highlight some of the ways we put this into practice.

To help people understand how we work to guard against scraping, we share ongoing updates around actions we’ve taken to protect against data misuse across our platforms and share ways people can best protect their data.

We have invested in infrastructure and tools to make it harder for scrapers to acquire data from our services and more difficult to capitalize off of it if they do. Examples of these investments include rate limits and data limits. Rate limits cap the number of times anyone can interact with our products in a given amount of time, while data limits keep people from getting more data than they should need to use our products normally.
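To illustrate the rate-limiting concept described above, here is a minimal token-bucket sketch in Python. This is an illustrative example only, not Meta’s actual anti-scraping infrastructure; the class name, capacity, and refill rate are all hypothetical.

```python
import time

class TokenBucket:
    """Illustrative token-bucket rate limiter: allow at most `capacity`
    actions in a burst, refilled at `rate` tokens per second."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # Caller has exceeded the rate limit.

# Hypothetical limit: 5 actions per burst, refilling 1 token per second.
bucket = TokenBucket(capacity=5, rate=1.0)
results = [bucket.allow() for _ in range(7)]
# The first 5 burst requests pass; the rest are throttled until refill.
```

A data limit works similarly, except the budget is measured in records or bytes returned rather than in requests made.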

We have blocked billions of suspected scraping actions per day across Facebook and Instagram. We have taken a variety of actions against unauthorized scrapers including disabling accounts and requesting that companies hosting scraped data take that data down. In 2020 and 2021 we took over 300 enforcement actions against unauthorized scraping.

Going forward, we plan to continue to publish more about our approach to scraping as well as continuing our ongoing updates on actions we’re taking to address scraping.


We strive to design products and features with privacy in mind.

Another photo of Meta CEO Mark Zuckerberg addressing employees at an outdoor town hall

“This is going to be a major turning point for our company, and it is going to require all of your help and work to help deliver on this for the people we serve. We have a responsibility to protect people’s privacy.”

- Mark Zuckerberg, Chief Executive Officer

The accountability processes, safeguards and technical mechanisms that we’ve built help ensure that new products and features embed privacy by design. We’ve seen these updated processes enable us to improve our privacy approach in new products and features, as we pivot to respond to the world around us.

Our work to communicate transparently includes providing education to improve understanding and awareness of our practices and ensuring information is accessible and easy to find.

There are many ways we communicate data privacy practices to users. In some instances we communicate with users through a dedicated privacy section of our Newsroom, where we provide more information about how we’ve approached privacy in the context of particular features or issues. In others, we provide users with in-product notices or contextual education about our privacy controls to help them understand our data processing activities and how we use their information to inform their experience. Most recently, we introduced the Privacy Center, a central hub where users can better understand our practices and make informed decisions about their privacy in a way that is right for them.

We also communicate this information through our Privacy Policy, which describes how Meta collects, processes, uses, stores, and shares personal information. We recently updated our Privacy Policy to state our data practices more clearly and at an accessible reading level, offer greater transparency, and describe our current practices more specifically, including with real-world examples for users. The updated policy also aims to facilitate more user engagement by incorporating audiovisual resources, in-context access to relevant settings, and the tools users need to make informed privacy decisions and exercise their data privacy rights.

Privacy Center is a place for people to better understand our practices so they can make informed decisions about their privacy in a way that is right for them. Through education and access to privacy and security controls, we address some of the most common privacy concerns from the billions of people who spend their time with us every day.

Privacy Center has several modules including sharing, collection, use, security, and ads to directly connect an issue or concern with the relevant privacy and security controls we’ve built across our apps and services over the years. It can be found by navigating to Settings and Privacy on the mobile web and desktop versions of Facebook or by visiting our Privacy and Security site. We’ll continue to update Privacy Center and add more modules and controls to help people understand our approach to privacy across our apps and technologies.

To provide greater transparency and control to people, we’ve developed a number of privacy tools, like Privacy Checkup, which guides people through important privacy and security settings on Facebook to help them strengthen account security and manage who can see what they share and how their information is used, and Off Facebook Activity, which provides a summary of activity that businesses and organizations share with us about people’s interactions, such as visiting their apps or websites. Off Facebook Activity also gives people the option to disconnect their past activity from their account.

We are continuously working to improve many of these tools to provide greater transparency and control to people.

We launched Access Your Information in 2018 to give people a central place to access their information on Facebook. Since then, we’ve worked to improve transparency and usability by reorganizing data categories into more granular and easy to understand subcategories, such as ‘Ads information’, ‘Connections’, and ‘Apps and Websites off of Facebook’.

We also added search functionality so people can find data categories more easily, as well as information about how your data may be used to personalize your experience on Facebook. We made these updates to connect people to meaningful, enjoyable and relevant experiences, and to make it easier for people to access and understand their information.

In 2020, we launched improvements to Activity Log as a transparency and control tool to help people archive or delete their old posts in one, centralized place. We received feedback from privacy experts and people about the limitations around managing old posts, photos and other content in bulk. So we created an archival control for content you no longer want others to see on Facebook, but may want to keep for yourself.

Manage Activity allows you to manage posts in bulk, with filters to help you sort and find the content you are looking for, like posts with specific people or from a specific date range.

It also includes a deletion control that provides a more permanent option, so people can move old posts in bulk to the trash. After 30 days, posts sent to the trash will be deleted, unless you choose to manually delete or restore them before then.

We introduced the option to use disappearing messages on WhatsApp and Messenger. When the option for disappearing messages is turned on, new messages sent to a chat are designed to disappear from the chat after a number of days, helping the conversation feel lighter and more private. In a one-to-one chat, either person can turn disappearing messages on or off. In groups, admins will have the control.

We started with seven days because we think it offers peace of mind that conversations aren’t permanent, while remaining practical so you don’t forget what you were chatting about. The shopping list or store address you received a few days ago will be there while you need it, and then disappear after you don’t.

As part of Meta’s vision for a privacy-focused platform, we believe people’s personal, private communications with other people should be secure. We care deeply about providing the ability for people to communicate privately with their friends and loved ones where they have confidence that no one else can see into their personal conversations.

We currently provide private communication through WhatsApp, Messenger and Instagram DMs. In WhatsApp, end-to-end encryption ensures only you and the person you’re communicating with can read or listen to what is sent, and nobody in between. And in Messenger and Instagram DMs, you have the option to protect your messages by end-to-end encryption just for you and the person you’re talking to.

In the past year, we have announced new security features, including end-to-end encryption for personal chats with multiple people in Messenger and notifications when someone takes a screenshot of a disappearing message.

We expect future versions of Messenger, Instagram DMs, and WhatsApp to become the main ways people communicate on the Meta network. We’re focused on making all of these apps faster, simpler, and more secure for private conversations, including with end-to-end encryption, and we plan to add more ways for people to interact privately with friends and groups. We also offer a number of communication platforms across Meta that bring people together in a more social, less personal space, such as Facebook and Quest, conversations among Facebook Group members, or watching videos together on a livestream. Some of these platforms may allow for user customization of individual preferences, or require additional integrity measures to ensure compliance with our Community Standards.

With more than 2 billion users, we are excited to give people more choices to protect their privacy. People should have the right to choose how to personalize their experience, and we have a responsibility to set a clear, thorough approach to privacy by providing people with the safest private messaging experience for communicating with friends.

Recognizing that young people have unique privacy needs, our product teams pay particular attention to youth privacy. Our goal is to provide services that respect the best interests of the young people who use them, in coordination and consultation with parents, regulators, policymakers, and civil society experts.

We employ a number of methods in our efforts to strike the right balance of giving young people the benefits of Meta products while also keeping them safe and ensuring our products generally have age-appropriate measures in place.

Some of our most recent efforts in youth privacy on Instagram include defaulting everyone who is under 16 years old (or under 18 in certain countries) into private accounts, making it harder for potentially suspicious accounts to find young people, and limiting the options advertisers have to target ads for young people. We also launched Family Center on Instagram, which is our first-ever supervision experience to help parents and guardians become more involved in their teens’ use of Instagram. For our youngest users, Messenger Kids offers an age-appropriate messaging and communications experience that gives parents controls to monitor and review aspects of their kids’ activity.

We want young people to enjoy Meta products while making sure we never compromise their privacy and safety. We’ll continue listening to young people, parents, lawmakers and other experts to build products that work for young people and are trusted by parents.

As a part of our efforts to respond to the COVID-19 pandemic, we focused on how we could share data for good while protecting people’s privacy. We developed data sets in the form of maps of populations and movement to help inform disease forecasting efforts and protective measures during the pandemic.

To protect people’s privacy, we use technical mechanisms to mitigate re-identification risk, which include:

  • Aggregating publicly available datasets that include location information in a way that protects people’s privacy, using techniques like spatial smoothing to create weighted averages and avoiding map tiles where very few people live
  • Applying a differential privacy framework to publicly available Movement Range Maps, which are mobility datasets that not only contain location data but also track the movement of people over time
    • This framework accounts for the sensitivity of the dataset and adds noise proportionally, so that with high probability no one can re-identify individuals, even using data we were not considering at the time of design
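To give a rough illustration of the differential privacy approach described above, a standard Laplace mechanism adds noise scaled to the query’s sensitivity divided by the privacy budget epsilon. This is a minimal sketch with hypothetical parameter values, not those actually used for Movement Range Maps:

```python
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with Laplace noise of scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    # The difference of two independent exponentials is Laplace-distributed
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise

# Example: a per-tile movement count, where any one person can change the count by at most 1
noisy_count = laplace_mechanism(true_value=130, sensitivity=1.0, epsilon=0.5)
```

A smaller epsilon means a larger noise scale, trading accuracy of the released statistic for a stronger privacy guarantee.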

In addition to the safeguards applied to mitigate re-identification risk, we also use data use agreements that set clear guidelines for responsible data practices.


We’re embedding privacy into the technical fabric of our company.

While we still have a lot of work left to do, we have made meaningful progress towards our goal to embed our privacy responsibilities into our systems. Our continued technical privacy investments will ensure that we can fulfill our mission to honor people’s privacy in everything we do.


Building privacy into our decision-making processes is an important area of continued focus. But as Meta grows, an important way of scaling privacy protections will be to build technical foundations that promote privacy and accountability at scale. We are creating sustainable technical solutions to meet evolving privacy expectations and ensure consistent application of our privacy requirements across our products and systems.

Creating advanced technical solutions to address privacy will require considerable effort. It’s a company-wide undertaking that will likely take years to fully accomplish, but we believe it’s an important investment in the future of privacy at Meta.

Building technical solutions that can adapt to evolving privacy expectations first requires we complete significant underlying technical work, including improving how we manage data across its lifecycle. One example of this is the extensive work we have done to facilitate the deletion of user data.

People who use our applications and services expect that their data will be properly deleted when they request it. They trust that when they make such a request, their data will be removed effectively and completely.

The current industry approach to data deletion is onerous: developers must manually write repetitive code that accounts for each update or change to a product and verifies that all the deletion logic still holds up. The complex architecture of modern distributed data stores leaves further room for error.

We have built a framework and infrastructure that helps alleviate the risk of error inherent in this manual process. Engineers annotate the data being collected with the intended deletion behavior (e.g., "when a user deletes a post, also delete all the comments") and our framework and infrastructure handle the deletions that must run across multiple data stores, with reliability guarantees. Giving engineers this technical infrastructure makes deletion easier to implement and helps us ensure that we address deletion early in the product development process.

While it does not yet cover all data at Meta, we are already processing billions of deletions every day using this infrastructure.
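As a hypothetical sketch of what annotation-driven deletion can look like, the example below registers deletion behavior alongside a type definition and cascades automatically. The decorator and store names are illustrative, not Meta’s actual framework:

```python
# Illustrative sketch of annotation-driven cascading deletion (not Meta's framework)
from dataclasses import dataclass

DELETION_RULES = {}  # parent type name -> list of (child type name, foreign key)

def deletes(child_type, via):
    """Annotate a type: deleting an instance also deletes referencing children."""
    def wrap(cls):
        DELETION_RULES.setdefault(cls.__name__, []).append((child_type, via))
        return cls
    return wrap

@deletes("Comment", via="post_id")
@dataclass
class Post:
    id: int

@dataclass
class Comment:
    id: int
    post_id: int

STORE = {
    "Post": {1: Post(1)},
    "Comment": {10: Comment(10, 1), 11: Comment(11, 2)},
}

def delete(type_name, obj_id):
    """Delete a record, then recursively apply the annotated deletion rules."""
    STORE[type_name].pop(obj_id, None)
    for child_type, fk in DELETION_RULES.get(type_name, []):
        for child_id, child in list(STORE[child_type].items()):
            if getattr(child, fk) == obj_id:
                delete(child_type, child_id)  # grandchildren cascade too

delete("Post", 1)
print(sorted(STORE["Comment"]))  # → [11] (the comment on the deleted post is gone)
```

Because the deletion behavior is declared once on the data type rather than rewritten at every call site, product changes cannot silently leave orphaned deletion logic behind.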

We’re also investing in privacy-enhancing technologies (PETs): technologies based on advanced cryptographic and statistical techniques that help minimize the data we collect, process and share. While these technologies are an important part of our work to build a technical foundation that supports privacy, we are still in the early stages of this investment and are continuing to explore their various use cases.

As an example, we’re doing research around our use of cryptographic techniques like blind digital signatures and anonymized logging to prevent fraud in a number of use cases, including investigating crashes, assessing performance, and monitoring product and advertising metrics. This work helps to mitigate privacy concerns by applying data minimization practices while simultaneously preventing fraud at scale.
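To illustrate the core idea behind blind signatures, here is a toy RSA-based sketch with tiny, insecure parameters: the signer attests to a value (for example, an anonymized log token) without ever seeing it, so the signed report cannot be linked back to the signing request:

```python
# Toy RSA blind signature with tiny, insecure parameters (illustration only)
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # signer's private exponent (Python 3.8+)

m = 42      # value to be signed, e.g. an anonymized log token
r = 7       # client's secret blinding factor, coprime to n

blinded = (m * pow(r, e, n)) % n     # client blinds m before sending to signer
s_blinded = pow(blinded, d, n)       # signer signs without learning m
s = (s_blinded * pow(r, -1, n)) % n  # client removes the blinding factor

assert pow(s, e, n) == m             # standard RSA verification succeeds on m
```

Unblinding works because (m · rᵉ)ᵈ = mᵈ · r (mod n), so multiplying by r⁻¹ leaves a valid signature on m itself.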

We’re also experimenting with double-blind matching technology to enable matching of data records while preserving privacy. Much of the existing work in matching technologies either reveals the matched records to one or both parties or uses a complicated circuit-based construction to preserve privacy. The PETs we’re experimenting with offer a range of solutions for use cases where no personal information besides the desired output is shared with the other party.
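One well-known construction in this space is a Diffie-Hellman-style private set intersection, where each party applies a secret exponent to hashed records so that only shared records collide. The sketch below is purely illustrative, with toy parameters standing in for a real cryptographic group:

```python
import hashlib

# Toy prime modulus (a Mersenne prime); real systems use proper cryptographic groups
P = 2**127 - 1

def h(item: str) -> int:
    """Hash an identifier into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

a_secret, b_secret = 123457, 987643   # each party's private exponent
party_a = {"alice@example.com", "bob@example.com"}
party_b = {"bob@example.com", "carol@example.com"}

# Round 1: each party exponentiates its own hashed records with its secret
a_once = {pow(h(x), a_secret, P) for x in party_a}
b_once = {pow(h(x), b_secret, P) for x in party_b}

# Round 2: each party exponentiates the other's values; exponentiation commutes,
# so shared records collide while unmatched records remain unlinkable
a_twice = {pow(v, b_secret, P) for v in a_once}
b_twice = {pow(v, a_secret, P) for v in b_once}

print(len(a_twice & b_twice))  # → 1 (only the shared record matches)
```

Neither party ever sees the other’s raw identifiers; the only output either side learns is which (blinded) records appear on both sides.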

AI powers back-end services like personalization, recommendation, and ranking that help enable a seamless, customizable experience for people who use our products and services. At Meta, we believe it’s important to empower people with tools and resources that help them understand how AI shapes their product experiences. We highlight a few examples of this below.

One of the ways we are exploring increased explainability is through model and system documentation. We recently shared the next step in this journey by publishing a prototype AI System Card tool that is designed to provide insight into an AI system’s underlying architecture and help better explain how the AI operates.

This inaugural AI System Card outlines the AI models that comprise an AI system and can help enable a better understanding of how these systems operate based on an individual’s history, preferences, settings, and more. The pilot System Card we’ve developed, and continue to test, is for Instagram feed ranking, which is the process of taking as-yet-unseen posts from accounts that a person follows and then ranking them based on how likely that person is to be interested in them.

Making AI more explainable is a cross-industry, cross-disciplinary dialogue. Companies, regulators, and academics are all testing ways of better communicating how AI works through various forms of guidance and frameworks that can empower everyday people with more AI knowledge. Because AI systems are complex, it is both important and challenging to develop documentation that consistently addresses people’s need for transparency and their desire for simplicity in such explanations. As such, data sheets, model cards, System Cards, and fact sheets have different audiences in mind. We hope that a System Card can be understood by experts and non-experts alike, and can provide a unique, in-depth view into the very complex interface between AI systems and people, in a repeatable and scalable way for Meta.

Providing a framework that is technically accurate, captures the nuance of how AI systems operate at Meta’s scale, and is easily digestible for everyday people using our technologies is a delicate balance, especially as we continue to push the state of the art in the field.


We’re invested in privacy and are committed to continuous improvement.

“Getting privacy right is a continual, collective investment across our company, and is the responsibility of everyone at Meta to advance our mission.” - Michel Protti, Chief Privacy Officer for Product

Privacy is one of the defining social issues of our time, and is central to Meta’s vision for the future. When we say privacy is everyone’s priority at Meta, we mean it; it’s an integral part of everything we do here from the top down to the ground up.

We are continuing to improve upon our privacy foundation. We are committed to continuing to seek feedback from and collaborate with stakeholders across industry, civil society, think tanks, academia, and our users to improve our program. Our privacy work is never finished, and we understand that this commitment means continuously improving and focusing on this every day.