Overturned

Journalist Recounting Meeting in Gaza


Type of Decision: Summary

Policies and Topics: Freedom of expression, News events, War and conflict

Community Standard: Dangerous Organizations and Individuals

Region/Countries: Palestinian Territories, Spain

Platform: Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Case Summary

A journalist appealed Meta’s decision to remove a Facebook post recounting his personal experience of interviewing Abdel Aziz Al-Rantisi, a co-founder of Hamas. This case highlights a recurring issue in the over-enforcement of the company’s Dangerous Organizations and Individuals policy, specifically regarding neutral posts. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

Case Description and Background

Soon after the October 7, 2023, terrorist attacks on Israel, a journalist posted on Facebook their written recollection of interviewing Abdel Aziz Al-Rantisi, a co-founder of Hamas, which is a designated Tier 1 organization under Meta’s Dangerous Organizations and Individuals policy. The post describes the journalist’s trip to Gaza, their encounters with Hamas members and local residents, and the experience of finding and interviewing Al-Rantisi. The post contains four photographs, including images of Al-Rantisi, the interviewer and masked Hamas militants.

In their appeal to the Board, the user clarified that the intention of the post was to inform the public about their experience in Gaza and their interview with one of the original founders of Hamas.

Meta removed the post from Facebook, citing its Dangerous Organizations and Individuals policy, under which the company removes from its platforms certain content about individuals and organizations it designates as dangerous. However, the policy recognizes that “users may share content that includes references to designated dangerous organizations and individuals in the context of social and political discourse. This includes content reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities.”

After the Board brought this case to Meta’s attention, the company determined that the “content aimed to increase the situational awareness” and therefore did not violate the Dangerous Organizations and Individuals Community Standard. Meta cited the social and political discourse allowance in the context of “neutral and informative descriptions of Dangerous Organizations and Individuals activity or behavior.” Furthermore, Meta said that “the social and political discourse context is explicitly mentioned in the content so there is no ambiguity [about] the intent of the user” in this case.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors and increase fairness for people who use Facebook and Instagram.

Case Significance

The case highlights over-enforcement of Meta’s Dangerous Organizations and Individuals policy, specifically regarding news reporting on entities the company designates as dangerous. This is a recurring problem, which has been particularly frequent during the Israel-Hamas conflict, in which one of the parties is a designated organization. The Board has issued several recommendations relating to the news reporting allowance under the Dangerous Organizations and Individuals policy. Continued errors in applying this important allowance can significantly limit users’ freedom of expression, reduce the public’s access to information and impair public discourse.

In a previous decision, the Board recommended that Meta “add criteria and illustrative examples to Meta’s Dangerous Organizations and Individuals policy to increase understanding of exceptions, specifically around neutral discussion and news reporting,” (Shared Al Jazeera Post, recommendation no. 1). Meta reported implementation of this recommendation, as demonstrated through published information. In an update to the Dangerous Organizations and Individuals policy dated December 29, 2023, Meta modified its explanations and now uses the term “glorification” instead of “praise” in its Community Standard.

Furthermore, the Board has recommended that Meta “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors,” (Mention of Taliban in News Reporting, recommendation no. 5). Meta reported implementation of this recommendation but has not published any information to demonstrate this.

In cases of automated moderation, the Board has urged Meta to implement an internal audit procedure to continually analyze a statistically representative sample of automated removal decisions in order to reverse and learn from enforcement mistakes (Breast Cancer Symptoms and Nudity, recommendation no. 5), which Meta has reported implementing.

The Board believes that full implementation of these recommendations could reduce the number of enforcement errors under Meta’s Dangerous Organizations and Individuals policy.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
