Overturned
Footage of Massacres in Syria
May 13, 2025
A user appealed Meta’s decision to remove a Facebook post of a video containing graphic footage of violence in Syria.
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
A user appealed Meta’s decision to remove a Facebook post of a video containing graphic footage of violence in Syria. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post with a warning screen.
About the Case
In December 2024, a user posted a video on Facebook featuring violent scenes, including beatings and stabbings, individuals being lit on fire, and injured and deceased people, including children. The caption above the video, written in Arabic, describes the content as showing scenes of massacres [participated in] by the “criminals of the Party of Satan in Syria” and asserts that the people have not forgotten and will not forget the crimes that were committed. According to established news reporting, “Party of Satan” appears to be a reference to Hezbollah.
Under Meta’s Violent and Graphic Content Community Standard, Meta removes “the most graphic content and add[s] warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through.” “Videos of people, living or deceased, in non-medical contexts” depicting “dismemberment,” “visible innards,” “burning or charred persons” or “throat-slitting” are, therefore, not allowed. Meta also notes that for “Imagery (both videos and still images) depicting a person’s violent death (including their moment of death or the aftermath) or a person experiencing a life-threatening event,” the company applies a warning screen so that people are aware that the content may be disturbing. In these instances, the company also limits the ability to view the content to adults aged 18 and older. Moreover, when provided with additional context, Meta may allow graphic content “in order to shed light on or condemn acts such as human rights abuses or armed conflict” to “allow room for discussion and awareness raising.”
After the Board brought this case to Meta’s attention, the company determined that the content should not have been removed under the Violent and Graphic Content policy. The company then restored the content to Facebook with a warning screen.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Case
This case illustrates continuing issues with Meta’s ability to moderate content that raises awareness of and documents grave human rights violations. Despite language in Meta’s Violent and Graphic Content policy acknowledging that users may share content to shed light on or condemn acts such as “human rights abuses or armed conflict,” the company continues to remove content from its platforms that aims to do precisely that.
The Board has issued recommendations to guide Meta’s enforcement of graphic or violent content shared to condemn or raise awareness, with the aim of making these allowances enforceable at scale and not only in exceptional circumstances. For example, in the Sudan Graphic Video decision, the Board recommended that “Meta should amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purposes of raising awareness or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing” (Sudan Graphic Video, recommendation no. 1). Meta declined the recommendation after a feasibility assessment. The company reported undertaking policy development on the subject and ultimately decided to keep the status quo to “remove content by default, but allow content with a warning label when there is additional context” (Meta Q4 2023 Quarterly Update on the Oversight Board).
Additionally, the Board previously recommended that “Meta should add to the public-facing language of its Violent and Graphic Content Community Standard detail from its internal guidelines about how the company determines whether an image ‘shows the violent death of a person or people by accident or murder’” (Russian Poem, recommendation no. 2). Meta demonstrated partial implementation of this recommendation through published information: the company updated the language in its Violent and Graphic Content Community Standard by adding parentheticals to clarify what it means by “violent death.” The language now reads, “Imagery (both videos and still images) depicting a person’s violent death (including their moment of death or the aftermath).” Meta’s extended definition did not, however, sufficiently explain how the company determines whether an image “shows the violent death of a person or people by accident or murder.”
Furthermore, the Board has recommended that Meta “improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard” (Punjabi Concern over the RSS in India, recommendation no. 3). The Board underscored in this recommendation that “more detailed transparency reports will help the public spot areas where errors are more common, including potential specific impacts on minority groups.” Implementation of this recommendation is currently in progress. In its most recent update on this recommendation, Meta explained that the company is “in the process of compiling an overview of enforcement data to confidentially share with the Board.” The document will outline data points that provide indicators of enforcement accuracy across various policies. Meta stated that the company “remain[s] committed to compiling an overview that addresses the Board’s overarching call for increased transparency on enforcement accuracy across policies” (Meta’s H2 2024 Bi-Annual Report on the Oversight Board – Appendix).
The Board stresses the importance of Meta continuing to improve its ability to accurately detect content that seeks to raise awareness about or to condemn human rights abuses and to keep such content, with a warning screen, on the platform under the company’s Violent and Graphic Content policy.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.