Overturned

Syria Protest

Type of Decision: Summary

Topics: Community organizations, Freedom of expression, Protests

Community Standard: Dangerous Organizations and Individuals

Location: Syria, United States

Platform: Instagram
This is a summary decision. Summary decisions examine cases in which Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not consider public comments and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

An Instagram user appealed Meta's decision to remove a video that encouraged Syrians to resist the regime of Bashar al-Assad. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

Case Description and Background

In August 2023, an Instagram user posted a video showing Abdul Baset al-Sarout, a Syrian football player, activist and public symbol of opposition to the country's president, Bashar al-Assad. Sarout was killed in 2019. In the video, Sarout is heard saying in Arabic, "We have one liberated neighborhood in Syria, we are a thorn in this regime, we will return to this neighborhood," and that "the revolution continues," encouraging Syrians to resist the regime of Bashar al-Assad. The video had about 30,000 views.

The Instagram post was removed for violating Meta's Dangerous Organizations and Individuals policy, which prohibits representation of, and certain speech about, the groups and people that the company judges to be linked to significant real-world harm.

In their appeal to the Board, the user described their account as a non-profit page “dedicated to spreading information and raising awareness.” Furthermore, the user argued the content did not violate Instagram’s guidelines.

After the Board brought this case to Meta’s attention, the company determined the content did not contain any references to a designated organization or individual and did not violate its policies. The company restored the content to the platform.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights the incorrect removal of content that did not contain any references to a designated organization or individual.

To reduce enforcement errors in places experiencing conflict or other sensitive circumstances, the Board has recommended that Meta "enhance the capacity allocated to HIPO [high-impact false positive override system] review across languages to ensure that more content decisions that may be enforcement errors receive additional human review" (Mention of the Taliban in News Reporting, recommendation no. 7), a recommendation on which Meta has reported progress. The Board has also recommended that Meta "evaluate automated moderation processes for enforcement of the Dangerous Organizations and Individuals policy" (Öcalan's Isolation, recommendation no. 2), which Meta has declined to implement.

This case highlights over-enforcement of Meta's Dangerous Organizations and Individuals policy. The Board's cases suggest that errors of this sort are all too frequent, and the company should make reducing them a high priority. Full adoption of these recommendations, along with published information demonstrating successful implementation, could reduce the number of incorrect removals under Meta's Dangerous Organizations and Individuals policy.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
