Multiple Case Decision

Graphic Content in Awareness-Raising Context

Four users appealed Meta's decision to remove their Facebook and Instagram posts that depicted graphic content in an awareness-raising context.

4 cases included in this bundle

FB-PW4MPO8J (Overturned)
Case about violent and graphic content on Facebook
Platform: Facebook
Topic: News events, Safety, War and conflict
Standard: Violent and graphic content
Location: Mexico
Date: Published on April 21, 2026

FB-GKN5CHBI (Overturned)
Case about violent and graphic content on Facebook
Platform: Facebook
Topic: News events, Safety, War and conflict
Standard: Violent and graphic content
Location: Mexico
Date: Published on April 21, 2026

FB-S5ZFX6W8 (Overturned)
Case about violent and graphic content on Facebook
Platform: Facebook
Topic: News events, Safety, War and conflict
Standard: Violent and graphic content
Location: Russia, Ukraine
Date: Published on April 21, 2026

IG-IA95C08R (Overturned)
Case about violent and graphic content on Instagram
Platform: Instagram
Topic: News events, Safety, War and conflict
Standard: Violent and graphic content
Location: Brazil
Date: Published on April 21, 2026

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

Four users appealed Meta's decision to remove their Facebook and Instagram posts that depicted graphic content in an awareness-raising context. After the Board brought the appeals to Meta’s attention, the company reversed its original decisions and restored all four posts.

About the Cases

In August and September 2025, four users from three countries (Mexico, Ukraine and Brazil) posted content containing graphic images in an awareness-raising context. Meta initially removed all four posts for violating its Violent and Graphic Content policy.

In the first case, the user posted a video on Facebook showing an individual holding a bucket of water near several people with visible burn injuries, apparently in the aftermath of the explosion of a gas truck in Mexico City. The caption, translated from Spanish, reads: "Mistakes you should not make." In the audio, a person explains that the individual with the bucket intended to pour water on the burn victims, which would have introduced bacteria to their exposed skin. In their appeal to the Board, the user stated that the video was about an explosion in Mexico.

The second case involves a Facebook post with a video showing injured victims, seemingly from the same explosion in Mexico City. As the video focuses on some of the victims, who are naked, the image darkens and becomes obscured. The caption, translated from Spanish, reads "Strong images," followed by hashtags such as "#fire" and "#news." In their appeal to the Board, the user stated they were "informing the public about important issues."

In the third case, a news organization posted a video that, in two different moments, features an image of a slit throat. The caption, translated from Ukrainian, reads: "With his throat cut, [name] crawled for five days from captivity to Ukrainian positions [...] Together with his comrades, he fell into an ambush by Russian soldiers." The post also claims that the combatant and his comrades were tortured while in captivity. In their appeal to the Board, the news organization explained that the video was published as "important evidence of torture and severe human rights violations," and "serves the public interest by documenting crimes and raising awareness about [their] consequences."

In the fourth case, an Instagram user posted a video of an individual peeling the skin on their hand to reveal exposed flesh. The caption, translated from Portuguese, provides information about the risks of infection and fluid deficiency associated with a loss of skin continuity. It emphasizes the importance of proper care for skin lesions. The caption also contains hashtags related to nursing and healthcare. In their appeal to the Board, the user stated that they are a teacher specializing in dermatology and that their content is meant to be educational for students and healthcare professionals.

Under the Violent and Graphic Content policy, Meta prohibits "videos of people, living or deceased, in non-medical contexts, depicting [...] visible innards, such as exposed organs, bones, or muscle tissue on living or deceased persons; burning or charred persons; or throat-slitting."

After the Board selected the cases, Meta determined that it had incorrectly removed all four posts. In the first case, Meta concluded that, while the injuries are visible, the content does not violate the policy because it does not depict redness, blistering, sloughing or weeping fluid. In the second case, Meta reached the same conclusion, this time because the injuries are not visible as a result of the image getting darker as they come into focus. Regarding the third case, Meta explained that, for still images that show a slit throat, it applies "a warning screen to indicate that the content is sensitive and restricts visibility to those aged 18 and over." Therefore, Meta concluded that the content should be allowed provided it is placed behind a warning screen with visibility limited to adults. For the fourth case, Meta observed that "the content depicts surface-level flesh with no visible organs, bones or muscle tissue," adding that "the burns do not appear severe enough to constitute a violation" of the Violent and Graphic Content policy. Consequently, Meta restored all four posts, applying an age-gated warning screen to the content in the third case.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Cases

This bundle provides several examples of the overenforcement of Meta’s Violent and Graphic Content Community Standard, showing how it may impact awareness raising around news events involving matters of public interest, health-related issues and human rights abuses.

In Sudan Graphic Video, the Board noted that although the policy rationale of the Violent and Graphic Content Community Standard states that Meta allows awareness raising, the specific rules within the Community Standard do not include a "raising awareness" exception. The Board recommended that "Meta should amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing" (Sudan Graphic Video, recommendation no. 1). The Board considers this recommendation to have been reframed by Meta. The company stated: "Last year, we invited the Board to attend a Policy Forum related to this recommendation [...] Ultimately, given the feedback received during this policy development, we aligned on the status quo policy to remove content by default, but allow content with a warning label when there is additional context. While this does not amend the Community Standard to allow videos of people or dead bodies when shared for the purposes of raising awareness of or documenting human rights abuses at scale, it does continue to allow this content to be assessed upon escalation" (Meta's Q4 2023 Quarterly Update on the Oversight Board).

In relation to the third case, which shows the result of apparent violence against a captured Ukrainian soldier, the Board has previously recommended that "Meta should commit to preserving, and where appropriate, sharing with competent authorities evidence of atrocity crimes or grave human rights violations, such as those specified in the Rome Statute of the International Criminal Court, by updating its internal policies to make clear the protocols it has in place in this regard" (Armenian Prisoners of War Video, recommendation no. 1). In the same decision, the Board also recommended that "following the development of the protocol on evidence preservation related to atrocity crimes and grave human rights violations, Meta should publicly share this protocol in the Transparency Center" (Armenian Prisoners of War Video, recommendation no. 4). Meta reported implementation of these recommendations but did not publish information to demonstrate this. In response to both recommendations, Meta stated: "We briefed the Oversight Board [...] to share details about our new approach to retaining potential evidence of atrocity crimes and serious violations of international human rights law" (Meta's H1 2024 Bi-Annual Update on the Oversight Board).

The Board believes that full implementation of the recommendations mentioned above would allow users to more effectively raise awareness around news events involving matters of public interest, health-related issues and human rights abuses.

Decision

The Board overturns Meta’s original decisions to remove the four pieces of content. The Board acknowledges Meta’s correction of its initial errors once the Board brought the cases to Meta’s attention.
