Dehumanizing speech against a woman

A user appealed Meta’s decision to leave up a Facebook post that attacked an identifiable woman and compared her to a motor vehicle (“truck”). After the Board brought the appeal to Meta's attention, the company reversed its original decision and removed the post.

Type of Decision: Summary decision

Policies and Topics: Sex and gender equality

Community Standard: Hate speech

Region/Countries: United States

This is a summary decision. Summary decisions examine cases where Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not consider public comments, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas of potential improvement in its policy enforcement.

Case description and background

In December 2022, a Facebook user posted a photo of a clearly identifiable woman. The caption above the photo, in English, referred to her as a pre-owned truck for sale and extended the metaphor, describing the woman as requiring paint to hide damage, emitting unusual smells, and being rarely washed. The user added that the woman was “advertised all over town.” Another user reported the content to the Board, saying it was misogynistic and offensive to the woman. The post received over two million views and was reported to Meta more than 500 times by Facebook users.

Before Meta reassessed its original decision, the user who posted the content edited it, superimposing a “vomiting” emoji over the woman’s face. They updated the caption to say they had concealed her identity because they were embarrassed “to say that I owned this pile of junk.” They also added the names of various dating websites on which the woman supposedly had a profile.

Under Meta’s Bullying and Harassment policy, the company removes content that targets private figures with “[a]ttacks through negative physical descriptions” or that makes “[c]laims about sexual activity.”

Meta initially left the content on Facebook. When the Board brought this case to Meta’s attention, the company reviewed both the original post and the updated post. It noted that both versions include a negative physical description of a private individual, comparing her to a truck, and both make claims about her sexual activity by asserting she is “advertised all over town,” with the edited post more explicit in its references to dating websites. Meta therefore determined that both versions violated its Bullying and Harassment policy and that its original decision to leave up the content was incorrect. The company then removed the content from Facebook.

Board authority and scope

The Board has authority to review Meta's decision following an appeal from the user who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors and increase fairness for people who use Facebook and Instagram.

Case significance

This case highlights a concern with Meta’s failure to enforce its Bullying and Harassment policy, an underenforcement that can significantly deter open online expression for women and other marginalized groups. Despite the post receiving millions of views and hundreds of reports from Facebook users, Meta failed to remove content that violated two elements of the Bullying and Harassment Community Standard: attacks through “negative physical descriptions” and “claims about sexual activity.”

Previously, the Board issued a series of recommendations urging Meta to clarify several points of ambiguity in its Bullying and Harassment policy (“Pro-Navalny protest in Russia,” recommendations no. 1-4), half of which Meta implemented and half of which the company declined after a feasibility assessment. The Board is concerned that this case may indicate a more widespread problem of underenforcement of the Bullying and Harassment Community Standard, which likely has disproportionate impacts on women and members of other vulnerable groups. The Board underlines the need for Meta to holistically address the concerns raised in its case decisions and to implement relevant recommendations to reduce the error rate in moderating bullying content, while balancing the company’s values of “Safety,” “Dignity,” and “Voice.”


Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
