Planet of the Apes racism

A user appealed Meta’s decision to leave up a Facebook post that likens a group of Black individuals involved in a riot in France to the “Planet of the Apes.”

Type of Decision

Summary decision

Policies and Topics

Discrimination, Marginalized communities, Race and ethnicity
Community Standard
Hate speech
This is a summary decision. Summary decisions examine cases where Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not consider public comments, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case summary

A user appealed Meta’s decision to leave up a Facebook post that likens a group of Black individuals involved in a riot in France to the “Planet of the Apes.” After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

Case description and background

In January 2023, a Facebook user posted a video that appears to have been taken from a car driving at night. The video shows the car driving through neighborhoods until a group of Black men appears; towards the end of the footage, they are seen chasing the car. The caption states in English that, “France has fell like planet of the friggin apes over there rioting in the streets running amok savages” and writes that “the ones” that make it to “our shores” are given housing at what the user believes to be significant cost. The post had under 500 views. A Facebook user reported the content.

Under Meta’s Hate Speech policy, the company removes content that dehumanizes people belonging to a designated protected characteristic group by comparing them to “insects” or “animals in general or specific types of animals that are culturally perceived as intellectually or physically inferior (including but not limited to: Black people and apes or ape-like creatures; Jewish people and rats; Muslim people and pigs; Mexican people and worms).”

Meta initially left the content on Facebook. After the Board brought this case to Meta’s attention, the company determined that the content violated the Hate Speech Community Standard and that its original decision to leave up the content was incorrect. Meta explained to the Board that the caption for the video violated its Hate Speech policy by comparing the men to apes and should have been removed. The company then removed the content from Facebook.

Board authority and scope

The Board has authority to review Meta's decision following an appeal from the user who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors and increase fairness for people who use Facebook and Instagram.

Case significance

This case highlights difficulties in Meta’s consistent enforcement of its content policies. Meta has a specific provision within its Hate Speech policy prohibiting comparisons of Black people to apes, and yet it still failed to remove the content in this case. Other cases taken by the Board have examined Meta’s Hate Speech policies and the contextual factors that determine whether speech involves qualified or unqualified behavioral statements about a protected group made as legitimate social commentary. The content in this case, however, unequivocally uses dehumanizing hate speech for the express purpose of denigrating a group of individuals based on their race and should have been removed. The case also underscores how enforcement failures may allow content that discriminates against a group of people based on their race or ethnicity to remain on Meta’s platforms. The Board notes that this type of content, at scale, contributes to the further marginalization of visible minority groups and may even lead to offline harm, particularly in regions where there is existing animosity towards immigrants.

Previously, the Board issued a recommendation that emphasized the importance of moderators recognizing nuance in Meta’s Hate Speech policy. Specifically, the Board recommended that “Meta should clarify the Hate Speech Community Standard and the guidance provided to reviewers, explaining that even implicit references to protected groups are prohibited by the policy when the reference would reasonably be understood” (Knin cartoon decision, recommendation no. 1). Meta has demonstrated partial implementation of this recommendation through published information. The Board underlines the need for Meta to address these concerns to reduce the error rate in moderating hate speech content.


Decision

The Board overturns Meta's original decision to leave up the content. The Board acknowledges Meta's correction of its initial error once the Board brought the case to Meta's attention.
