OVERTURNED
2023-047-FB-UA

Niger Coup Cartoon

A user appealed Meta’s decision to remove a Facebook post on the military coup in Niger.
Policies and topics
News events, Politics, War and conflict
Hate speech
Region and countries
Sub-Saharan Africa
France, Niger
Platform
Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post on the military coup in Niger. This case highlights errors in Meta’s content moderation, including its automated systems for detecting hate speech. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

Case Description and Background

In July 2023, a Facebook user in France posted a cartoon showing a military boot labeled “Niger” kicking a person wearing a red hat and dress; the dress bears the geographical outline of Africa. Earlier that month, there was a military takeover in Niger: General Abdourahamane Tchiani, head of the presidential guard, ousted President Mohamed Bazoum with the guard’s help and declared himself leader of the country.

Meta originally removed the post from Facebook, citing its Hate Speech policy, under which the company removes content containing attacks against people on the basis of a protected characteristic, including some depictions of violence against these groups.

After the Board brought this case to Meta’s attention, the company determined that the content did not violate the Hate Speech policy and its removal was incorrect. The company then restored the content to Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights inaccuracies in Meta’s moderation systems that detect hate speech. The Board has issued recommendations on improving automation and transparency, including urging Meta to “implement an internal audit procedure to continually analyze a statistically representative sample of automated removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Meta has reported that it is implementing this recommendation but has not yet published information to demonstrate implementation.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention. The Board also urges Meta to speed up the implementation of still-open recommendations to reduce such errors.