Overturned

Threat of Violence Against the Rohingya People

A user appealed Meta’s decision to leave up a comment under a Facebook post claiming the Rohingya people cause disturbances and are “tricksters.” The comment also called for the implementation of control measures against them.

Type of Decision: Summary
Policies and Topics: Marginalized communities, Race and ethnicity, War and conflict
Community Standard: Violence and Incitement
Region/Countries: Myanmar
Platform: Facebook
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to leave up a comment under a Facebook post claiming the Rohingya people cause disturbances and are “tricksters.” The comment also called for the implementation of control measures against them as well as for their “total erasure.” This case highlights a recurring issue of under-enforcement of the company’s Violence and Incitement policy, specifically regarding threats against vulnerable groups. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

About the Case

In January 2024, a Facebook user commented on a post about the Rohingya people in Myanmar. The comment included a caption above an image of a pig defecating. In the caption, the user wrote that “this group” (referring to the Rohingya) are “tricksters” who “continue to cause various social problems.” The caption argued that the Myanmar government has taken the right course of action in “curbing” the Rohingya and called for their “absolute erasure from the face of the Earth” for the sake of “national security and well-being.”

In their statement to the Board, the user who appealed wrote that they live in Myanmar and expressed frustration at Meta’s lack of action against comments calling for genocide, such as the one in this case. They explained that they have witnessed first-hand how Meta’s inability to effectively moderate hate speech against the Rohingya people has led to offline violence, and noted that the Rohingya are languishing in refugee camps.

According to Meta’s Violence and Incitement policy, the company prohibits “threats of violence that could lead to death (or other forms of high-severity violence).” After the user appealed to Meta, the company initially left the content on the platform.

After the Board brought this case to Meta’s attention, the company determined the comment “advocates lethal violence through ‘erasing’ Rohingya people from the Earth” and therefore violates the Violence and Incitement Community Standard. The company then removed the content from the platform.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case highlights Meta’s repeated under-enforcement of its policies against violent and incendiary rhetoric targeting the Rohingya people. This is a well-known and recurring problem: in 2018, Meta commissioned an independent human rights assessment to ascertain the degree to which the company played a role in exacerbating disinformation campaigns and prejudice against the Rohingya. Meta’s failure to moderate content that endorses genocide and promotes ethnic cleansing of the marginalized Rohingya population has also been documented by civil society groups, including Amnesty International in a report detailing Meta’s role in the atrocities committed against the community.

In a previous decision, the Board recommended that “Meta should rewrite [its] value of ‘safety’ to reflect that online speech may pose risk to the physical security of persons and the right to life, in addition to the risks of intimidation, exclusion and silencing” (Alleged Crimes in Raya Kobo, recommendation no. 1). Meta has implemented this recommendation in part.

The Board urges Meta to improve its detection and enforcement of speech that calls for violence against the Rohingya people.

Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
