Overturned

Report on Violent Attack in Sri Lanka

Type of Decision: Summary
Topic: News events, Safety, Violence
Community Standard: Violent and Graphic Content
Region/Countries: Sri Lanka
Platform: Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to remove a Facebook post consisting of a link to a news article about a gunman firing at a church in Sri Lanka. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

About the Case

In April 2025, a Facebook user posted a link to a news article in English about a gunman firing at a church in Sri Lanka. The article was shared in a group self-described as dedicated to “news and discussions” on conflict and peacebuilding. The link automatically generated a preview headline and caption, which described the country as being on high alert six years after the Easter Sunday bombings that killed hundreds of people. The preview also featured an image of the aftermath of an April 2019 bombing at St. Sebastian’s Church in the city of Negombo.

Meta initially removed the content under its Violent and Graphic Content Community Standard. The company later clarified that the relevant policy for the removal in this case was the Violence and Incitement Community Standard. Under this policy, the company removes “threats of violence against various targets,” defined as “statements or visuals representing an intention, aspiration or call for violence against a target.” However, Meta allows threats “when shared in awareness-raising or condemning context.”

In their appeal to the Board, the user noted that the group to which the article was posted is designed for people to learn about escalations and de-escalations of conflicts worldwide, to support peacebuilding and conflict resolution.

After the Board brought this case to Meta’s attention, the company determined that the content should not have been removed. Meta highlighted that the Violence and Incitement policy allows content shared to raise awareness of a threat of violence made by others. The company considered that the post was raising awareness because it consisted of a link to a news report about a gunman who opened fire at a place of worship in Sri Lanka. Meta also explained that the news link was shared to the Facebook group a day after the event occurred, which suggested that the intent was to inform others about the event. The company then restored the content to Facebook.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case highlights the overenforcement of Meta’s Violence and Incitement Community Standard. The company’s content moderation systems failed to recognize the awareness-raising context, directly impacting the posting user’s ability to share information about attacks against places of worship in Sri Lanka. The Board notes that the description of the group where the user posted the article highlights that it is a forum for news and discussions involving peaceful conflict resolution. The system’s failure also impacted other users’ ability to access this piece of information.

The Board has previously issued relevant recommendations aiming to increase enforcement accuracy, tracking and transparency around the exceptions to Meta’s Violence and Incitement policy. First, the Board has recommended that Meta “provide the Board with the data that it uses to evaluate its policy enforcement accuracy” to “inform future assessments and recommendations to the Violence and Incitement policy” (United States Posts Discussing Abortion, recommendation no. 1). Implementation of this recommendation is currently in progress. In its H2 2024 report, Meta stated: “We are in the process of compiling an overview of Community Standards enforcement data to share confidentially with the Board. This document will outline data points that gauge our enforcement accuracy across various policies, including the Violence and Incitement policy.”

The Board has also recommended that Meta “add to the public-facing language of its Violence and Incitement Community Standard that the company interprets the policy to allow content containing statements with ‘neutral reference to a potential outcome of an action or an advisory warning,’ and content that ‘condemns or raises awareness of violent threats’” (Russian Poem, recommendation no. 1). Meta has demonstrated partial implementation of this recommendation through published information. The company updated its Violence and Incitement policy (see Meta’s Q4 2023 Report) to include: “We do not prohibit threats when shared in awareness-raising or condemning context, when less severe threats are made in the context of contact sports or when threats are directed against certain violent actors, like terrorist groups.” However, Meta noted in its Q3 2023 Report that it decided against adding specific language regarding “neutral references to a potential outcome of an action or an advisory warning, where the poster is not making a violent threat,” as this is “generally covered in the rationale of the policy and the definition and framing of a threat.”

Additionally, the Board has recommended that, “to better inform users when policy exceptions could be granted, Meta should create a new section within each Community Standard detailing what exceptions and allowances apply. When Meta has specific rationale for not allowing certain exceptions that apply to other policies (such as news reporting or awareness raising), Meta should include that rationale in this section of the Community Standard” (News Documentary on Child Abuse in Pakistan, recommendation no. 1). Implementation of this recommendation is also in progress. In its initial response to the Board in 2024, Meta stated that the company would “consider updates to [its] Community Standards to provide more clarity regarding relevant allowances that may apply to specific policy sections and rationales for why they do or do not apply.” Meta also explained that, “given the broad and complex scope” of the recommendation, implementation would take time.

The repeated overenforcement of Meta’s Violence and Incitement policy undermines users’ ability to share news reporting and information about violent attacks. The Board believes that full implementation of recommendation no. 1 from the United States Posts Discussing Abortion case would further enhance Meta’s ability to identify and address enforcement accuracy shortcomings, thereby reducing the number of enforcement errors. Moreover, full implementation of the recommendations on transparency around exceptions to the Violence and Incitement policy would help prevent, and reverse, erroneous takedowns by increasing both users’ and content reviewers’ awareness of policy exceptions.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
