Overturned

Media Conspiracy Cartoon

A user appealed Meta’s decision to leave up a Facebook comment containing an image that depicts a caricature of a Jewish man holding a music box labelled “media,” while a monkey labelled “BLM” sits on his shoulder.

Type of Decision
Summary

Policies and Topics
Topic: Discrimination, Marginalized communities, Race and ethnicity
Community Standard: Hate speech

Region/Countries
Australia, Germany, Israel

Platform
Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors, and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to leave up a Facebook comment containing an image that depicts a caricature of a Jewish man holding a music box labelled “media,” while a monkey labelled “BLM” sits on his shoulder. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the comment.

Case Description and Background

In May 2023, a user posted a comment containing an image that depicts a caricature of a Jewish man holding an old-fashioned music box, while a monkey rests on his shoulder. The caricature has an exaggerated hooked nose and is labelled with a Star of David inscribed with “Jude,” resembling the badges Jewish people were forced to wear during the Holocaust. The monkey on his shoulder is labelled “BLM” (the acronym for the “Black Lives Matter” movement), while the music box is labelled “media.” The comment received fewer than 100 views.

The content violates two separate elements of Meta’s Hate Speech policy. The policy prohibits content that references “harmful stereotypes historically linked to intimidation,” such as “claims that Jewish people control financial, political, or media institutions.” It also forbids dehumanizing imagery, including content that equates “Black people and apes or ape-like creatures.” The comment violates both elements: it insinuates that Jewish people control media institutions, and it equates “BLM” with a monkey. In their appeal to the Board, the user who reported the content stated that it was antisemitic and racist towards Black people.

Meta initially left the content on Facebook. When the Board brought this case to Meta’s attention, the company determined that the post violated its Hate Speech policy, and that its original decision to leave up the content was incorrect. The company then removed the content from Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the person who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case that is under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, to reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights gaps in Meta’s enforcement of its Hate Speech policy, which can allow content promoting harmful stereotypes and dehumanizing imagery to spread. Correcting such enforcement errors is essential given Meta’s responsibility to mitigate the risk of harm associated with content targeting marginalized groups.

The Board has previously examined the under-enforcement of Meta’s Hate Speech policy in cases where user content contained implicit but clear violations of the company’s standards. The Board recommended that Meta “clarify the Hate Speech Community Standard and the guidance provided to reviewers, explaining that even implicit references to protected groups are prohibited by the policy when the reference would reasonably be understood” (Knin Cartoon decision, recommendation no. 1). Meta partially implemented this recommendation.

The Board has also issued recommendations aimed at reducing the number of enforcement errors. It recommended that Meta “implement an internal audit procedure to continuously analyse a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Meta stated that it already performs this work, but it has not published information demonstrating this.

The Board reiterates that full implementation of the recommendations above would help decrease enforcement errors under the Hate Speech policy, reducing the prevalence of content that promotes harmful stereotypes and dehumanizing imagery.

Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought this case to Meta’s attention.
