Multiple Case Decision

Negative Stereotypes of African Americans

The Board reviewed three Facebook posts containing racist material, which Meta left up. After the Board brought these appeals to Meta’s attention, the company reversed its original decisions and removed the posts.

3 cases included in this bundle

Overturned

FB-LHBURU6Z

Case about hate speech on Facebook

Platform: Facebook
Topic: Discrimination, Race and ethnicity
Standard: Hate speech
Location: United States
Published: April 18, 2024

Overturned

FB-1HX5SN1H

Case about hate speech on Facebook

Platform: Facebook
Topic: Discrimination, Race and ethnicity
Standard: Hate speech
Location: United Kingdom, United States
Published: April 18, 2024

Overturned

FB-ZD01WKKW

Case about hate speech on Facebook

Platform: Facebook
Topic: Discrimination, Race and ethnicity
Standard: Hate speech
Location: United States
Published: April 18, 2024

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

The Board reviewed three Facebook posts containing racist material, which Meta left up. Each post included a caricature or manipulated image of African Americans invoking offensive stereotypes: absent fathers, welfare dependency and looting a store. These cases highlight errors in Meta’s enforcement of its Hate Speech and Bullying and Harassment policies. After the Board brought these appeals to Meta’s attention, the company reversed its original decisions and removed the posts.

About the Cases

In late 2023, the Board received three separate appeals concerning three different images posted on Facebook, each containing negative stereotypes of African Americans.

In the first post, which was viewed about 500,000 times, a user shared a computer-generated image of a store on fire, with Black people depicted as cartoon characters wearing hooded sweatshirts, carrying merchandise and running out of the store. In the image, the name of the store, Target, has been changed to “Loot,” and in the accompanying caption the user describes the image as the next Pixar movie.

The second post features a computer-generated image that also imitates a movie poster, showing a Black woman with exaggerated physical features holding a shopping cart full of Cheetos. The movie’s title is “EBT” (Electronic Benefit Transfer), the name of a system through which social welfare benefits are delivered in the United States. At the top of the poster, in place of actors’ names, are the names Trayvon Martin and George Floyd, both African American victims of violence: Martin was shot by an armed vigilante in 2012 and Floyd was killed by police in 2020. Their deaths helped spark protests about racial disparities in the U.S. justice system.

The third post, which was viewed about 14 million times, features a meme claiming that “Adobe has developed software that can detect photoshop in an image.” Beneath the claim is an image of a woman whose entire face is overlaid with colorful, heat-map-style markings, implying that parts of the image have been altered. This is contrasted with an image of a Black family having a meal, in which only the father and the food on the table carry the same colorful markings, implying that these two elements were added through editing. The post reinforces the widespread negative stereotype about the lack of a father figure in Black families in the United States, a stereotype rooted in a complex history of systemic racism and economic inequality.

Meta initially left all three posts on Facebook, despite user reports. In their subsequent appeals to the Board, those same users argued that the content depicted harmful racial stereotypes of African Americans.

After the Board brought these cases to Meta’s attention, the company determined that each post violated the Hate Speech Community Standard, which bans direct attacks against people on the basis of protected characteristics, including race and ethnicity. The policy also specifically prohibits “targeting a person or a group” with “dehumanizing speech” in the form of “comparisons to criminals including thieves,” “mocking the concept, events or victims of hate crimes” and “generalizations that state inferiority in ... moral characteristics.”

Additionally, Meta determined the second post that includes the names of Trayvon Martin and George Floyd also violated the Bullying and Harassment Community Standard under which the company removes “celebration or mocking of death or medical condition” of anyone. The company explained that “the image includes the name of two deceased individuals, Trayvon Martin and George Floyd ... The content trivializes their deaths by implying they will star in a fictitious animated movie.”
The company therefore removed all three posts.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user who reported content that was left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook and Instagram users.

Significance of Cases

These cases highlight three instances in which Meta failed to effectively enforce its Hate Speech and Bullying and Harassment policies, leaving up violating posts despite user reports. Two of the posts received a high number of views. Such under-enforcement errors can negatively impact people belonging to protected-characteristic groups and contribute to an environment of discrimination. Addressing hate speech against marginalized groups is one of the Board’s strategic priorities.

In 2022, the Board recommended that Meta “clarify the Hate Speech Community Standard and guidance provided to reviewers, explaining that even implicit references to protected groups are prohibited by the policy when the reference could be reasonably understood” (Knin Cartoon, recommendation no. 1). Meta has reported partially implementing this recommendation.

Decision

The Board overturns Meta’s original decisions to leave up the three posts. The Board acknowledges Meta’s correction of its initial errors once the Board brought these cases to the company’s attention.
