Girls’ Education in Afghanistan

A user appealed Meta’s decision to remove a Facebook post discussing the importance of educating girls in Afghanistan.

Type of Decision

Summary decision
Policies and Topics

Children / Children's rights, Discrimination, Sex and gender equality
Community Standard
Dangerous Organizations and Individuals

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post discussing the importance of educating girls in Afghanistan. This case highlights an error in the company’s enforcement of its Dangerous Organizations and Individuals policy. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

Case Description and Background

In July 2023, a Facebook user in Afghanistan posted text in Pashto describing the importance of educating girls in Afghanistan. The user called on people to continue raising their concerns and noted the consequences of failing to take these concerns to the Taliban. The user also stated that preventing girls from accessing education would be a loss to the nation.

Meta originally removed the post from Facebook, citing its Dangerous Organizations and Individuals policy, under which the company removes content that “praises,” “substantively supports” or “represents” individuals and organizations it designates as dangerous, including the Taliban. The policy allows content that discusses a dangerous organization or individual in a neutral way or that condemns its actions.

After the Board brought this case to Meta’s attention, the company determined that the content did not violate the Dangerous Organizations and Individuals policy, and that the removal of the post was incorrect. The company then restored the content.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights mistakes in the enforcement of Meta’s Dangerous Organizations and Individuals policy, which can negatively impact users’ ability to share political commentary on Meta’s platforms. In this case, the content removed was a discussion of women’s education in Afghanistan after the Taliban takeover.

In a previous case, the Board recommended that Meta “add criteria and illustrative examples to its Dangerous Organizations and Individuals policy to increase understanding of exceptions for neutral discussion, condemnation and news reporting” (Shared Al Jazeera Post decision, recommendation no. 1). Meta reported in its Q2 2023 quarterly update that this recommendation had been fully implemented. Furthermore, the Board recommended that Meta “implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Meta has reported that it is implementing this recommendation but has not published information to demonstrate implementation.


Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention. The Board also urges Meta to speed up the implementation of still-open recommendations to reduce such errors.
