
Overturned

Cartoon Showing Taliban Oppression Against Women

A user appealed Meta’s decision to remove a Facebook post containing a political cartoon illustrating Afghan women’s oppression under the Taliban regime. This case highlights errors in Meta’s enforcement of its Dangerous Organizations and Individuals policy.

Type of Decision

Summary

Policies and Topics

Topic
Art / Writing / Poetry, Freedom of expression, Journalism
Community Standard
Dangerous Organizations and Individuals

Region/Countries

Location
Afghanistan, Netherlands

Platform

Facebook

This is a summary decision. Summary decisions examine cases in which Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not consider public comments and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post containing a political cartoon illustrating Afghan women’s oppression under the Taliban regime. This case highlights errors in Meta’s enforcement of its Dangerous Organizations and Individuals policy, specifically in the context of political discourse delivered through satire. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

Case Description and Background

In August 2023, a Facebook user, a professional cartoonist from the Netherlands, posted a cartoon showing three Taliban men seated on a car crusher with a group of distressed women beneath it. In the background, a meter labelled "oppress-o-meter" is connected to a control panel, and one of the men is pressing a button, causing the crusher to lower. The caption accompanying the image reads: "2 years of Taliban rule. #Afghanistan #Taliban #women #oppression." The post was removed for violating Meta’s Dangerous Organizations and Individuals policy, which prohibits representation of, and certain speech about, the groups and people the company judges to be linked to significant real-world harm.

In their appeal to the Board, the user indicated that the content was a political cartoon, satirical in nature, commenting on the continued and worsening oppression of women in Afghanistan under Taliban rule. Meta’s Dangerous Organizations and Individuals policy allows content that reports on, condemns or neutrally discusses dangerous organizations and individuals or their activities.

After the Board brought this case to Meta’s attention, the company determined that the content did not violate the Dangerous Organizations and Individuals policy and that its removal was incorrect. The company then restored the content to Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights flaws in Meta’s enforcement procedures, particularly in detecting and interpreting images associated with designated organizations and individuals. Over-enforcement of this policy can lead, as it did in this case, to the removal of artistic expression that is part of legitimate political discourse.

In 2022, the Board recommended that “Meta should assess the accuracy of [human] reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). In the same decision, the Board stated that “Meta should conduct a review of the HIPO ranker [high-impact false positive override system] to examine if it can more effectively prioritize potential errors in the enforcement of allowances to the Dangerous Organizations and Individuals policy” (Mention of the Taliban in News Reporting, recommendation no. 6). For both recommendations, Meta reported progress on implementation.

The Board has also recommended that “Meta should ensure that it has procedures to analyze satirical content and context properly and that moderators are provided adequate incentives to investigate the context of potentially satirical content” (Two Buttons Meme, recommendation no. 3). Meta has reported partially implementing this recommendation.

The Board emphasizes that full implementation of these recommendations could reduce the number of enforcement errors under Meta’s Dangerous Organizations and Individuals policy.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
