Overturned

Hateful Memes Video Montage

Type of Decision: Summary
Topics: Race and ethnicity, Religion, Sex and gender equality
Community Standard: Hate Speech
Location: United States
Platform: Facebook

This is a summary decision. Summary decisions examine cases in which Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not consider public comments and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to leave up a Facebook post containing a video montage of antisemitic, racist, homophobic and transphobic memes, set to German music. This case highlights errors in Meta’s enforcement of its Hate Speech policy. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

Case Description and Background

In August 2023, a Facebook user posted a three-minute video clip containing a series of antisemitic, racist, homophobic and transphobic memes. The memes, among other things, allege Jewish control of media institutions, praise the Nazi military, express contempt toward interracial relationships, compare Black people to gorillas, display anti-Black and anti-LGBTQIA+ slurs, and advocate for violence against these communities. The accompanying caption, in English, claims that the post would get the user’s Facebook page suspended but that it would be “worth it.” The post was viewed approximately 4,000 times and reported fewer than 50 times.

This content violates several elements of Meta’s Hate Speech policy, which prohibits content that references “harmful stereotypes historically linked to intimidation,” such as “claims that Jewish people control financial, political or media institutions.” The policy also forbids dehumanizing imagery, such as content that equates “Black people and apes or ape-like creatures,” as well as the use of racialized slurs. The memes violate each of these elements: one alleges Jewish control of the media by showing a kippah (a Jewish skullcap) with “facebook” written on it; another compares Black people to gorillas; and a third displays the n-word on a sword wielded by a cartoon character. The caption also calls on those interacting with the content to “show these degenerates your utter contempt” and to download the video.

Meta initially left the content on Facebook. After the Board brought this case to Meta’s attention, the company determined that the content did violate the Hate Speech Community Standard and removed the content.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights errors in how Meta enforces its Hate Speech policy. The Board has previously examined this policy in cases involving slurs, such as the South Africa Slurs case, in which it found the use of a racial slur degrading and exclusionary, and in cases involving content that refers to a group of people as subhuman, such as the Knin Cartoon case.

The post in this case contained multiple violations, from slurs targeting Black people to claims that Jewish people control the media. Although the caption shows the user was acutely aware the content was likely to be found violating and removed as hateful speech, the post stayed up until the Board identified the case for review based on another user’s appeal. The Board has also published summary decisions illustrating that Meta continues to struggle to enforce its rules against hate speech, as shown in the Planet of the Apes Racism and Media Conspiracy Cartoon cases, concerning speech targeting Black and Jewish people respectively.

Previously, in the Post in Polish Targeting Trans People case, the Board noted that Meta’s failure to take correct enforcement action, despite multiple signals about a post’s harmful content, led it to conclude that the company is not living up to the ideals it has articulated on the safety of LGBTQIA+ and other marginalized communities. The Board urges Meta to close enforcement gaps under the Hate Speech Community Standard.

Decision

The Board overturns Meta's original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
