Growing Our Impact Through Policy Recommendations

As part of our case decisions and policy advisory opinions, we make recommendations on how Meta can improve its content policies, enforcement systems and overall transparency for billions of Facebook, Instagram and Threads users. The changes that Meta has made in response are already making the company more transparent and helping it to treat users more fairly. Our recommendations have led to the creation of new policies and enforcement procedures, and made existing ones easier to understand. Many of our proposals echo, or build upon, calls that civil society groups and other stakeholders have been making for many years, forcing Meta to consider and respond publicly to longstanding calls for action.

So far, we have made more than 250 recommendations to Meta. The company must respond to each one publicly within 60 days and update us on its implementation. To date, Meta has fully or partially implemented, or reported progress on implementing, around 65% of our recommendations.

This adoption of our recommendations – from clearer user messaging to new policy protocols – is having a real and growing impact on the way that people and communities are treated online.

Telling People Why Their Content Was Removed

We believe that giving people more information on how and why their content has been removed builds trust and improves fairness. In the past, users often did not know exactly which rule they had violated. In response to our recommendations, the company has introduced new messaging globally, telling people the specific policy they violated under its rules on hate speech, dangerous organizations and individuals, and bullying and harassment. Meta has also noted that giving users more information about which hate speech violation led to their content being removed produced a statistically significant improvement in users’ perceptions of the company’s transparency and legitimacy.

Identifying Breast Cancer Context

So that posts raising awareness of breast cancer symptoms are not wrongly removed, we urged Meta to improve its automated detection of images with text overlay. The company responded by enhancing Instagram’s identification techniques for breast cancer content. In addition to improving text overlay-based detection, Meta launched a health content classifier to better identify image-based breast cancer content. Together, these two classifiers have routed thousands of posts that would previously have been automatically removed to human review instead. Over two 30-day periods in 2023, for example, this amounted to 3,500 pieces of content being sent for human review rather than being automatically removed. These changes are helping to preserve content shared by breast cancer patients and campaigners.

Empowering Iranian Protesters

To better protect political speech in Iran, we recommended that Meta allow the phrase “Marg bar Khamenei” (“Death to [Iran’s supreme leader] Khamenei”) to be shared in the context of the protests that began in 2022, and reverse its removals of this type of content. Comparing posts from the same public Instagram pages, groups and accounts over equivalent time periods before and after Meta implemented the recommendation, we found that those containing the phrase “Death to Khamenei” increased by nearly 30%.

Making Meta’s Penalty System Fairer and Clearer

Along with many civil society groups, we have regularly raised concerns about Meta’s penalty system, under which users can find themselves in “Facebook jail.” We have consistently asked for greater transparency in this area, urging the company to reform its strikes system and to explain to affected users why their content has been removed. Meta has since taken steps to reform its strikes system, providing greater transparency on how it works and on the associated penalties. However, there remains room for improvement. In our Mention of the Taliban in News Reporting decision, we urged greater transparency around the most “severe strikes,” because mistaken enforcement of serious violations raises significant concerns for journalists and activists in particular.

Other Areas of Board Impact

  • Translation of Facebook’s Community Standards into more than 20 additional languages, spoken by hundreds of millions of people.
  • Completion of the global rollout of new messaging telling people whether human or automated review led to their content being removed.
  • Notification of users when access to content has been restricted following a government request.
  • Creation of a new Community Standard on misinformation.
  • Introduction of a Crisis Policy Protocol.
  • Establishment of a new crisis coordination team to provide dedicated oversight of operations during imminent and emerging crises.