Overturned

Dehumanizing Comments About People in Gaza

A user appealed Meta’s decision to leave up a Facebook post claiming that Hamas originated from the population of Gaza, comparing them to a “savage horde.” After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

Type of Decision
Summary

Policies and Topics
Topic: Marginalized communities, Race and ethnicity, War and conflict
Community Standard: Hate speech

Region/Countries
Israel, Palestinian Territories

Platform
Facebook
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to leave up a Facebook post claiming that Hamas originated from the population of Gaza and reflects their “innermost desires,” comparing them to a “savage horde.” After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

About the Case

In December 2023, a user reposted an image on Facebook that featured text alongside a picture of an unnamed man, expressing the view that the “general public” in Gaza is not the “victim” of Hamas but rather that the militant group emerged as a “true reflection” of “the innermost desires of a savage horde.” The repost included an endorsing caption containing the words, “the truth.” The post was viewed fewer than 500 times.

Meta’s Hate Speech policy prohibits content targeting a person or group of people on the basis of their protected characteristics, specifically mentioning comparisons to “sub humanity” and including “savages” as an example. In this content, the reference to “the general public of Gaza” is an implicit reference to Palestinians in Gaza, thus targeting the protected characteristics of ethnicity and nationality.

In a statement appealing this case to the Board, the user argued that the post “constituted dehumanizing speech” by generalizing about the people of Gaza.

After the Board brought this case to Meta’s attention, the company determined that the content violated its Hate Speech policy and that its original decision to leave the content up was incorrect. The company then removed the content from Facebook.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user who reported content that was left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process involved, reduce errors and increase fairness for Facebook and Instagram users.

Significance of Case

This case highlights errors in Meta’s enforcement of its Hate Speech policy, specifically relating to content that attacks people based on their protected characteristics. Moderation errors are especially harmful during an ongoing armed conflict, when more robust content moderation practices should be in place.

The Knin Cartoon case similarly involved hate speech targeting a protected characteristic, ethnicity, by referring to one ethnic group as rats without explicitly naming it. However, while the Knin Cartoon case required historical and cultural context to interpret the symbolic portrayal of an ethnic group, the content in this case ties dehumanizing comments more directly to an entire population and would reasonably be understood as referring to people by protected characteristic.

In the Knin Cartoon decision, the Board recommended that Meta “clarify the Hate Speech Community Standard and the guidance provided to reviewers, explaining that even implicit references to protected groups are prohibited by the policy when the reference would reasonably be understood” (Knin Cartoon decision, recommendation no. 1), which Meta has reported as partially implemented. In Q4 2022, Meta reported that it “added language to the Community Standards and reviewers’ policy guidance clarifying that implicit hate speech will be removed if it is escalated by at-scale reviewers to expert review where Meta can reasonably understand the user’s intent.” The Board considers this recommendation only partially implemented because the updates were made to the general introduction of the Community Standards rather than to the Hate Speech Community Standard itself.

The Board believes that full implementation of this recommendation would reduce the number of enforcement errors under Meta’s Hate Speech policy.

Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
