Oversight Board publishes transparency report for third quarter of 2021


December 2021

On October 28, 2021, Facebook announced that it was changing its company name to Meta. In this text, Meta refers to the company, and Facebook continues to refer to the product and policies attached to the specific app.

Today we are issuing a transparency report for the third quarter of 2021.

As part of the Oversight Board’s commitment to transparency about our work, each quarter we share details on the cases that we are receiving from users, the decisions we have taken and our recommendations to Meta. Our first transparency reports, covering Q4 2020, Q1 and Q2 2021, were released in October 2021.

Today’s report shows that in Q3 2021 (July 1 to September 30, 2021), the Board continued to receive a growing volume of appeals from Facebook and Instagram users around the world, with around 339,000 cases submitted in this quarter.

The Board published six decisions during Q3, covering issues ranging from political protests to COVID-19 lockdowns. In four of these decisions, the Board overturned Meta’s decision on the content; in the other two, we upheld the company’s decision.

As part of our process for producing these decisions, we received over 100 public comments. Across these six decisions, we issued 25 policy recommendations, the majority of which Meta committed to implement.

Highlights from the Board’s Q3 transparency report

1. Number of cases submitted to the Board rises significantly

From July to September, we estimate that users submitted around 339,000 cases to the Board. This represents an increase of 64% over the 207,000 cases submitted in the second quarter of 2021.

Improvements to how users appeal to the Board through Facebook’s mobile app, as well as the Board’s growing profile, may have contributed to this increase. Since the Board started accepting cases last October, we have now received more than 860,000 requests from users to examine Meta’s content moderation decisions.

2. Increase in cases related to violence and incitement

From July to September, users who wanted their content restored mainly submitted cases concerning Facebook’s rules on bullying and harassment (34%), violence and incitement (30%), and hate speech (23%). The share of cases related to its rules on violence and incitement increased by two-thirds between the second and third quarters of this year, rising from around 18% to around 30%.

3. In around half of the cases shortlisted by the Board in Q3 2021, Meta decided its original decision to remove the content was incorrect

After the Board’s Case Selection Committee identifies cases for Board review, we ask Meta to confirm that they are eligible for review under the Bylaws. At this stage, Meta sometimes realizes that its original decision to remove the content was incorrect. This happened in about half of the cases shortlisted by the Case Selection Committee in Q3 2021 (13 out of 28 cases). Meta restored these pieces of content to Facebook or Instagram.

Sometimes, despite Meta’s change of position, the Board goes on to issue a full decision when the case has significant precedential value. This happened in two cases published in Q3 2021 (Ocalan’s isolation and Shared Al-Jazeera Post). Without the Board taking action, Meta’s incorrect decisions would likely have remained in effect.

4. Meta committed to implement either “fully” or “in part” more than half of the Board’s recommendations in Q3 2021

Of the 25 recommendations the Board made in its six decisions published from July to September, Meta said it was implementing nine recommendations “fully” and four recommendations “in part.” The company said it was “assessing feasibility” on five recommendations and claimed a further five recommendations represented “work Meta already does.” Meta said it would take “no further action” on two recommendations.

In response to recommendations the Board made in Q3, Meta made several commitments which could increase transparency for users.

For example, Meta agreed to provide information in its Transparency Center on content removed for violating Facebook’s Community Standards following a formal report by a government, including the number of requests it receives.

Meta also said it would "fully implement" our recommendation for an independent entity, not associated with either side of the Israeli-Palestinian conflict, to conduct a thorough examination into whether its content moderation in Hebrew and Arabic – including its use of automation – has been applied without bias.

Steering Meta towards greater transparency

In addition to issuing our Q3 transparency report, today we have also published two new case decisions which call for more transparency around Meta’s policies and how they are enforced.

In the decision about ayahuasca, a psychoactive brew with spiritual uses, we repeated our call for Meta to explain to users that it enforces the Facebook Community Standards on Instagram, with several specific exceptions.

In our decision about a wampum belt, a North American Indigenous art form, we asked Meta to assess how accurately its reviewers enforce exceptions on artistic expression and expression about human rights violations. It should also look at how a reviewer’s location impacts their ability to accurately assess ‘counter speech,’ where hate speech is referenced to resist oppression and discrimination.

As Meta responds to these recommendations, and others, we will be monitoring whether and how the company lives up to its promises.

Over time, we believe that the combined impact of our recommendations will push Meta to be more transparent and benefit users.

As we hold Meta to account, our independence is crucial. Our trustees play a key role in protecting the Board’s independence, and today we’re delighted to announce Stephen Neal, Chairman Emeritus and Senior Counsel of law firm Cooley, and Chairman of the Hewlett Foundation, as a new trustee.

Update on case 2021-017-FB-UA

Last month, we announced three new cases, including one concerning the Taliban takeover in Afghanistan. Due to user action, this case is no longer available for Board review: the user who appealed the case to the Board has deleted their post, removing it from the platform.

As Facebook users have the right to delete their content, we anticipate that events like this will happen from time to time. Where user action results in a case being withdrawn, we will announce this promptly.

What’s next

As part of our commitment to transparency, we will continue to publish transparency reports on a quarterly basis. We will also issue an annual report next year which assesses Meta’s performance in implementing our decisions and recommendations.

Attachments

Q3 Transparency Report