
Q1 2023 transparency report: Board publishes new data on the impact of its recommendations


June 2023

Today, the Oversight Board published its transparency report for Q1 2023. Alongside an overview of the Board’s activities during the quarter, the report includes new data on the impact of recommendations from our “Iran protest slogan” and “breast cancer symptoms and nudity” cases.

Documenting our impact on users

Reshaping Meta’s approach to content moderation is a long-term process. In the two-and-a-half years since we made our first recommendations, we have seen Meta make new commitments that have led to changes in its rules and enforcement. Many of these were included in our 2022 Annual Report.

The next step is to document how these changes are improving the experiences of Facebook and Instagram users around the world. To do this, Meta is providing us with data on the impact of specific recommendations. Our Implementation Team is also using tools such as CrowdTangle to document the effects of changes Meta has made due to our decisions and recommendations. Today, we are publishing two further data points which illustrate our impact.

Empowering Iranian protesters on Instagram

In our “Iran protest slogan” decision, we urged Meta to better protect political speech in Iran, where historic protests have been violently suppressed. In response, Meta announced it would allow the term “Marg bar Khamenei” (which literally translates as “Death to [Iran’s supreme leader] Khamenei”) to be shared in the context of ongoing protests in Iran and reversed previous enforcement actions against this kind of content.

We compared posts using the phrase “Marg bar Khamenei” from the same set of public pages, groups, and Instagram accounts over equivalent time periods before and after Meta implemented our recommendation. We found that, after implementation in January 2023, posts using the phrase from these pages, groups, and accounts increased by nearly 30%. Our statistical analysis indicated that this increase was unlikely to be attributable solely to random variation and was highly likely to have been caused by Meta implementing our recommendation.
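To illustrate the kind of before-and-after comparison described above, the sketch below tests whether an increase in post counts between two equal-length windows could plausibly be explained by random variation. It is an illustrative assumption, not the Board’s actual methodology: the post counts are placeholders, and the choice of an exact binomial test (a standard way to compare two Poisson rates) is ours.

    # Illustrative sketch only: not the Board's actual analysis. Assumes two
    # equal-length observation windows and roughly Poisson-distributed post
    # counts; the figures below are placeholders, not the Board's data.
    from scipy.stats import binomtest

    posts_before = 1000  # hypothetical posts using the phrase before implementation
    posts_after = 1290   # hypothetical posts after implementation (~29% increase)
    total = posts_before + posts_after

    # Under the null hypothesis of no change in posting rate (and equal window
    # lengths), each post is equally likely to fall in either window, so the
    # "after" count follows Binomial(total, 0.5). A small one-sided p-value
    # suggests the observed increase is unlikely to be random variation alone.
    result = binomtest(posts_after, n=total, p=0.5, alternative="greater")

    increase = (posts_after - posts_before) / posts_before
    print(f"Observed increase: {increase:.1%}")
    print(f"One-sided p-value under the no-change null: {result.pvalue:.3g}")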

Further improvements in how Meta identifies posts raising awareness of breast cancer

We also received further implementation information about the recommendation from our “breast cancer symptoms and nudity” decision urging Meta to “improve the automated detection of images with text-overlay to ensure that posts raising awareness of breast cancer are not wrongly flagged for review.”

Meta previously shared that improvements to its text-overlay detection, made following our recommendation, led to 2,500 pieces of content being sent to human review over a 30-day period in February and March 2023, content that would previously have been removed automatically.

In addition to these changes, Meta has now told us that it tested and deployed a new health content classifier to further enhance Instagram’s techniques for identifying image-based content about breast cancer. These enhancements have been in place since July 2021, and in the 28 days from March 21 to April 18, 2023, they contributed to an additional 1,000 pieces of content being sent for human review that would have previously been removed without review.

This new data further demonstrates our impact on how Meta moderates content raising awareness of breast cancer – including systemic changes and new tools – and shows the different ways in which Meta is responding to this important recommendation.

The Oversight Board in Q1 2023

Our Q1 2023 transparency report also contains an overview of the Board’s activities and interactions with Meta during this period.

Some highlights are listed below:

  • We published decisions on four cases in Q1 2023: “Iran protest slogan,” “gender identity and nudity” (which covered two cases), and “Sri Lanka pharmaceuticals.”
  • Stakeholders submitted nearly 300 public comments to the Board ahead of the deliberations on these cases.
  • Meta answered 53 of the 56 questions we asked while reviewing these cases, with three answered partially. At 95%, this was the highest share of questions Meta has answered since we started recording this data point in Q1 2021.
  • We also announced four new cases for consideration, as well as a policy advisory opinion on the Arabic term “Shaheed,” which accounts for more content removals under the Community Standards than any other single word or phrase on Meta’s platforms.
  • Facebook and Instagram users submitted more than 140,000 cases to the Board in Q1 2023.
  • The vast majority (81%) of user appeals to the Board in Q1 2023 concerned content shared on Facebook, while 19% concerned content shared on Instagram. This is the highest share of Instagram appeals the Board has received, up from just 1% in Q1 2022.
  • Following our concerns about Meta’s opaque penalty system and user concerns about being placed in “Facebook jail,” the company changed its ‘strikes’ system to make it fairer and more transparent.

Meta responds to our recommendations on removing COVID-19 misinformation

Earlier this year, we reviewed Meta’s COVID-19 misinformation policies, and Meta recently responded to our recommendations. Of the 17 recommendations from our review that still apply, Meta has committed to fully or partially implement more than three-quarters.

Our review of Meta’s COVID-19 policies was a deep dive into the company’s systems, policies, and past experience handling mis- and disinformation on its platforms. As Meta’s recent response to our recommendations shows, our review will have a broad impact on how Meta handles this kind of content moving forward.

A few noteworthy changes Meta has made, or has committed to make, as a result of our review:

  • Meta has committed to assess the feasibility of a localized approach to COVID-19 misinformation, including continuing to remove COVID-19 misinformation in countries that still consider COVID-19 a public health emergency now that the World Health Organization (WHO) has lifted the global health emergency designation. This is meaningful because, until now, Meta has been strongly committed to a single global policy on this and other issues.
  • Meta said it will clarify its “Misinformation about health during public health emergencies” policy by explaining that the requirement that information be “false” refers to false information according to the best available evidence at the time the policy was most recently re-evaluated. Misinformation researchers have long advocated for this type of policy to allow room for good-faith enforcement even as facts are evolving.
  • Access tools such as CrowdTangle and the Researcher API (previously known as FORT) have been a major sticking point with Meta over the last few months. Meta has committed to maintain the level of data access provided to researchers and academics, and to provide expanded access in the future not only to researchers covered by the Digital Services Act, but also to researchers in Global Majority countries.

Meta acknowledged that tools like CrowdTangle and the Researcher API provide “significant support to the research community, and that they help bring transparency to Meta’s policy decisions while improving societal understanding of complex issues.” The Board welcomes Meta’s commitment to transparency and looks forward to seeing that commitment in action.

Briefing on Meta’s decision to restore former President Trump

Following Meta’s decision in January 2023 to restore former President Trump to Facebook and Instagram, the Oversight Board asked for a briefing on Meta’s rationale behind the decision. In response, Meta briefed the Board on the company’s decision-making process. The Board believes the confidential briefing Meta provided demonstrated a principled approach to decision-making and showed that the company did in fact broadly follow our recommendations in this case. We applaud the process and the transparency the company provided to the Board. However, as an organization that believes in transparency, we call on Meta to be more transparent publicly about its decision-making in cases like this moving forward.

What’s next

As part of our commitment to transparency, we will continue to publish transparency reports on a quarterly basis. These will include new data about the impact of our recommendations on users.

Attachments

  • Q1 2023 quarterly transparency report
  • Table of recommendations