Multiple Case Decision
Thai Hostage Negotiator Interview
April 18, 2024
2 cases included in this bundle:
FB-XO941WWQ: Case about dangerous individuals and organizations on Facebook
FB-U3Y5VV2E: Case about dangerous individuals and organizations on Facebook
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
The Board reviewed two Facebook posts containing near-identical segments of a Sky News video interview with a Thai hostage negotiator describing his experience of working to free hostages captured by Hamas. After the Board brought the appeals to Meta’s attention, the company reversed its original decisions and restored each of the posts.
About the Cases
In December 2023, two users appealed Meta’s decisions to remove their posts, each containing near-identical clips of an interview broadcast by Sky News in November 2023. The video features a negotiator from Thailand who led an unofficial team of negotiators and helped secure the release of Thai nationals taken hostage by Hamas in Israel on October 7, 2023.
In the clip, the interviewee describes his part in the negotiations. He says he believes the Thai hostages, and all hostages, “were well taken care of” by Hamas because Hamas follows Islamic law, and that Hamas had set no conditions on the Thai captives’ release.
The negotiator, who sympathizes with the Palestinian people, cites decades of what he describes as Israeli mistreatment of Palestinians in the Occupied Territories. He asserts that Hamas was “targeting soldiers” and affirms that Hamas was justified in taking hostages “to help the Palestinians” and “to get the world’s attention focused on the Israeli treatment of Palestinians.”
In their appeals to the Board, both users said they posted the video to bring attention to the Thai negotiator’s statements, though for different reasons. One user said their intent was to highlight an interview that “shows Hamas in a more balanced light,” in contrast to common attitudes of the “Western propaganda machine.” That user’s caption states that the negotiator is refusing to stick to the narrative, and notes that an earlier post of theirs had been removed because the content mentioned a particular political organization. The second user, who posted the video without a caption or further commentary, said they were “calling out collaborators who lie and manipulate” in support of Hamas.
Meta initially removed both posts from Facebook under its Dangerous Organizations and Individuals policy, which prohibits “glorification” (previously “praise”), “support” and “representation” of individuals and organizations the company designates as dangerous. However, the policy recognizes that “users may share content that includes references to designated dangerous organizations and individuals in the context of social and political discourse. This includes content reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities.”
Once the Board brought these cases to Meta’s attention, the company determined that the posts did “not include any captions that glorify, support, or represent a dangerous organization or individual.” Additionally, the video was “previously shared by Sky News, and other news outlets, on Facebook… and [therefore] falls under the scope of its news reporting carveout.” Meta restored both posts.
Board Authority and Scope
The Board has authority to review Meta’s decisions following appeals from the users whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors and increase fairness for people who use Facebook and Instagram.
Significance of Cases
These cases highlight the persistent over-enforcement of Meta’s Dangerous Organizations and Individuals Community Standard, as noted in previous Board decisions. Continued errors in applying this policy reduce users’ access to neutral commentary, news reporting and condemnatory posts, to the detriment of freedom of expression.
In a previous decision, the Board urged Meta to “include more comprehensive data on Dangerous Organizations and Individuals Community Standard error rates in its transparency report” (Öcalan’s Isolation, recommendation no. 12), a recommendation the company declined to implement after a feasibility assessment.
Furthermore, the Board has recommended adding “criteria and illustrative examples to Meta’s Dangerous Organizations and Individuals Community Standard policy to increase understanding of exceptions, specifically around neutral discussion and news reporting,” in order to provide greater guidance to human reviewers (Shared Al Jazeera Post, recommendation no. 1). This recommendation is particularly relevant here, as that case concerned the removal of a news post about a threat of violence from the Izz al-Din al-Qassam Brigades, the military wing of the Palestinian group Hamas. The Board has also advised Meta to “assess the accuracy of reviewers enforcing the reporting allowance… to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). While Meta has reported implementing both recommendations, for Mention of the Taliban in News Reporting, recommendation no. 5, it has not published information to demonstrate this.
The Board has also recommended improvements to the moderation of posts containing videos, calling on the company to adopt “product and/or operational guideline changes that allow more accurate review of long-form videos” (Cambodian Prime Minister, recommendation no. 5). In response, Meta stated that it would “continue to iterate on new improvements for our long-form video review processes and metric creation and evaluation” in its Q3 2023 Quarterly Update on the Oversight Board.
The Board believes that full implementation of these recommendations could reduce the number of enforcement errors under Meta’s Dangerous Organizations and Individuals policy.
Decision
The Board overturns Meta’s original decisions to remove the content. The Board acknowledges Meta’s correction of its initial errors once the Board brought these cases to Meta’s attention.