Overturned

Hotel in Ethiopia

Type of Decision
Summary

Policies and Topics
Topic: Violence, War and conflict
Community Standard: Violence and Incitement

Region/Countries
Ethiopia

Platform
Facebook

This is a summary decision. Summary decisions examine cases where Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not consider public comments, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas where the company could improve its policy enforcement.

Case summary

A user appealed Meta's decision to leave up a Facebook post that called for a hotel in Ethiopia's Amhara region to be burned down. This case highlights Meta's error in enforcing its policy against a call for violence in a country experiencing armed conflict and civil unrest. After the Board brought the appeal to Meta's attention, the company reversed its original decision and removed the post.

Case description and background

On April 6, 2023, a Facebook user posted an image and caption that called for a hotel in Ethiopia's Amhara region to be burned down. The user claimed that the hotel was owned by a general in the Ethiopian National Defense Forces. The post also included a photograph of the hotel, its address, and the name of the general.

The user posted this content during a period of heightened political tension in the Amhara region when protests had been taking place for several days against the government's plan to dissolve a regional paramilitary force.

Under Meta's Violence and Incitement policy, the company removes content that calls for high-severity violence. In their appeal to the Board, the user who reported the content stated that the post calls for violence and violates Meta's Community Standards.

Meta initially left the content on Facebook. When the Board brought this case to Meta's attention, the company determined that the post violated its Violence and Incitement policy and that its original decision to leave up the content was incorrect. The company then removed the content from Facebook.

Board authority and scope

The Board has authority to review Meta's decision following an appeal from the user who reported the content (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, to reduce errors and increase fairness for people who use Facebook and Instagram.

Case significance

This case highlights Meta's error in enforcing its policy against a call for violence in a country experiencing armed conflict and civil unrest. Such calls pose a heightened risk of near-term violence and can exacerbate the situation on the ground. That is why the Board recommended Meta "assess the feasibility of establishing a sustained internal mechanism that provides the expertise, capacity and coordination required to review and respond to content effectively for the duration of a conflict" (Tigray Communication Affairs Bureau, recommendation no. 2). Meta is in the process of launching a crisis coordination team to provide dedicated operations oversight throughout imminent and emerging crises. The Board will continue to follow implementation of the new mechanism, together with existing policies, to ensure Meta treats users in affected regions more fairly.

The Board has also recommended that Meta commission an independent human rights due diligence assessment of how Facebook and Instagram have been used to spread hate speech and unverified rumors that heighten the risk of violence in Ethiopia, and publish the report in full (Alleged Crimes in Raya Kobo, recommendation no. 3). Meta described this recommendation as work it already does but did not publish information to demonstrate implementation.

Decision

The Board overturns Meta's original decision to leave up the content. The Board acknowledges Meta's correction of its initial error once the Board brought the case to the company's attention.