Overturned

Link to Wikipedia Article on Hayat Tahrir al-Sham

Type of Decision
Summary

Policies and Topics
Freedom of expression, Governments, War and conflict

Community Standard
Dangerous Organizations and Individuals

Region/Countries
Syria

Platform
Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to remove a reply to a Facebook comment that included a link to a Wikipedia article about Hayat Tahrir al-Sham (HTS). After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the content.

About the Case

In December 2024, amid a rebel offensive that led to the fall of Bashar al-Assad’s regime in Syria, a Facebook user in Macedonia posted about the former Syrian president fleeing to Moscow. Another user commented with a quote in Bulgarian referring to Hayat Tahrir al-Sham (HTS) as “Islamists from Al-Qaeda.” A third user then replied to that comment in Bulgarian, stating that HTS was one of the groups that had driven Assad out, and included a link to a Wikipedia article about the group.

Meta originally removed the third user’s reply from Facebook under its Dangerous Organizations and Individuals (DOI) policy. This policy prohibits the “glorification,” “support” and “representation” of designated entities and their leaders, founders and prominent members, as well as unclear references to them. However, the policy allows neutral discussion, including “factual statements, commentary, questions, and other information that do not express positive judgement around the designated dangerous organisation or individual and their behaviour.”

In their appeal to the Board, the third user stated that they shared the link for informational purposes and that they “do not support the organization, on the contrary, [they] condemn it.”

After the Board brought this case to Meta’s attention, the company determined that the content did not violate its DOI policy. It found its original decision to remove the content incorrect because the comment did not express “positive judgment” about Hayat Tahrir al-Sham. The company restored the content to the platform.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case highlights the over-enforcement of Meta’s Dangerous Organizations and Individuals policy. The Board previously noted, in the Karachi Mayoral Election Comment decision, that such mistakes can negatively affect users’ ability to “share political commentary and news reporting” about organizations labeled as “dangerous,” thereby infringing on freedom of expression.

The Board has issued several recommendations aimed at increasing the transparency and accuracy of enforcement of Meta’s Dangerous Organizations and Individuals policy and its exceptions. These include a recommendation to “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). While Meta reported that it had implemented this recommendation, the company did not publish information to demonstrate this.

In the same decision, the Board recommended that Meta “enhance the capacity allocated to HIPO [High Impact False Positive Override system] review across languages to ensure that more content decisions that may be enforcement errors receive additional human review” (Mention of the Taliban in News Reporting, recommendation no. 7). HIPO is a system Meta uses to identify cases in which it has acted incorrectly, for example by wrongly removing content. Meta reported exploring improvements to increase HIPO’s review capacity, which resulted in a “multifold increase in HIPO overturns” (Meta Q4 2022 Quarterly Update on the Oversight Board). The Board considers that Meta has reframed this recommendation, given that it is unclear from the company’s response whether the changes involved increased resources or only a reallocation for greater efficiency.

In the Punjabi Concern Over the RSS in India decision, the Board recommended that Meta “improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard.” The Board underscored that “more detailed transparency reports will help the public spot areas where errors are more common, including potential specific impacts on minority groups” (recommendation no. 3). Implementation of this recommendation is in progress. In its latest update, Meta explained that it is “in the process of compiling an overview of enforcement data to confidentially share with the Board.” The document will outline data points that provide indicators of enforcement accuracy across various policies, including the Dangerous Organizations and Individuals policy. Meta stated that the company “remain[s] committed to compiling an overview that addresses the Board’s overarching call for increased transparency on enforcement accuracy across policies” (Meta’s H2 2024 Bi-Annual Report on the Oversight Board – Appendix).

Furthermore, in a policy advisory opinion, the Board asked Meta to “explain the methods it uses to assess the accuracy of human review and the performance of automated systems in the enforcement of its Dangerous Organizations and Individuals policy” (Referring to Designated Dangerous Individuals as “Shaheed,” recommendation no. 6). The Board considers that Meta has also reframed this recommendation. The company stated that it conducts audits to assess the accuracy of its content moderation decisions and that these inform areas for improvement. Meta did not, however, explain the methods it deploys to perform these assessments.

The Board urges Meta to continue improving its ability to accurately enforce the exceptions to its Dangerous Organizations and Individuals policy. Full implementation of the recommendations above would further strengthen the company’s enforcement accuracy.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
