
Oversight Board publishes first Annual Report


June 2022

The idea that gave birth to the Board – that social media companies should not make the defining decisions on content moderation on their own – was simple to say but complex to carry out. In our first year, we started turning this idea into reality.

Today, we are publishing our first Annual Report, which covers the period from October 2020, when we started accepting appeals, through December 2021. This describes the progress we have made in improving how Meta treats users and other affected populations around the world – and points to how much more work there is to do.

To read our Annual Report in English, click here.

Our Annual Report is also available in Arabic, Chinese, French, Russian and Spanish.

Some highlights from the report are summarized below.

The Board received more than a million user appeals

There was clearly enormous pent-up demand among Facebook and Instagram users for a way to appeal Meta’s content moderation decisions to an organization independent from the company. Users submitted more than a million appeals to the Board through December 2021. More than eight in 10 of the appeals to restore content to Facebook or Instagram concerned posts removed under Meta’s rules on bullying, hate speech, or violence and incitement.

Issued 20 decisions, taking a human rights-based approach

We learned, with our different nationalities, backgrounds, and viewpoints, how to deliberate cases with no easy answers. We issued decisions with full, public explanations on 20 significant cases in 2021, on issues ranging from hate speech to COVID-19 misinformation, overturning Meta’s decisions 14 times. We took a human rights-based approach to analyzing content moderation decisions and received nearly 10,000 public comments that helped to shape our first judgments. We also asked Meta more than 300 questions as part of our first 20 cases, opening a transparent space for dialogue with the company that did not exist before. In many more cases, the Board’s work prompted the company to voluntarily reverse wrongful content moderation decisions.

Saw our recommendations having a growing impact on users

We also made 86 recommendations to Meta in 2021 that pushed the company to be more transparent about its policies. Meta’s responses to our case decisions and policy recommendations are starting to improve how it treats users:

  • Meta now gives people using Facebook in English who break its rules on hate speech more detail on what they’ve done wrong.
  • The company is rolling out new messaging in certain locations telling people whether automation or human review resulted in their content being removed, and has committed to provide new information on government requests and its newsworthiness allowance in its transparency reporting.
  • Meta translated Facebook’s Community Standards into Punjabi and Urdu, and committed to translate the platform’s rules into Marathi, Telugu, Tamil and Gujarati. Once these translations are complete, more than 400 million more people will be able to read Facebook’s rules in their native language.

Took a new data-driven approach to implementation

While Meta committed to implement most of the recommendations we made in 2021, our next task is to ensure that the company turns its promises into actions that will improve the experience of people using Facebook and Instagram. As such, our Annual Report applies a new, data-driven approach to track how the company is implementing each of our recommendations. This shows that for two-thirds of the 86 recommendations we made in 2021, Meta either demonstrated implementation or reported progress. We are also seeking new data from Meta to allow us to understand the precise impact of our proposals on users.

Plan to expand our work in 2022 and beyond

The Board has begun to build the foundations of a project that can successfully drive change within Meta. In 2022, we will build on this strong start. Last month we added three new Board Members, and we are in dialogue with Meta about expanding our scope, including to review user appeals of its decisions in areas like groups and accounts. We are also expanding our stakeholder outreach in Asia, Latin America, the Middle East, and Africa. As we continue our work, the Board will be part of a collective effort by companies, governments, academia, and civil society to shape a brighter, safer, digital future that will benefit people everywhere.

What’s next

As part of our commitment to transparency, we also publish quarterly transparency reports throughout the year. Today we are publishing our Q4 2021 transparency report alongside our Annual Report. We will also be releasing our Q1 2022 quarterly transparency report in the coming weeks.

Attachments

  • Annual Report - English
  • Annual Report - Arabic
  • Annual Report - Chinese
  • Annual Report - French
  • Annual Report - Russian
  • Annual Report - Spanish
  • Q4 2021 Quarterly Transparency Report