Overturned

Libya Floods

A user appealed Meta’s decision to remove a Facebook post discussing recent floods in Libya. This case highlights over-enforcement of the company's Dangerous Organizations and Individuals policy.

Type of Decision

Summary

Policies and Topics

Topic
Freedom of expression, Journalism, Natural disasters
Community Standard
Dangerous Organizations and Individuals

Region/Countries

Location
Libya

Platform

Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public of the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post discussing recent floods in Libya. In September 2023, there were devastating floods in northeast Libya caused by Storm Daniel and the collapse of two dams. A video in support of the victims of the floods, especially in the city of Derna, was removed for violating Meta’s Dangerous Organizations and Individuals policy. This case highlights over-enforcement of the company's Dangerous Organizations and Individuals policy, which adversely impacts users’ freedom to express solidarity and sympathy in difficult situations. After the Board brought the appeal to Meta’s attention, the company reversed its earlier decision and restored the post.

Case Description and Background

In September 2023, a Facebook user posted, without a caption, a video containing two images. The background image showed two individuals in military uniform with badges, one of which bore Arabic text reading “Brigade 444 – Combat.” Overlaid on it was a second image depicting two people pulling a third person out of a body of water. The people on the sides had the Arabic words for “west” and “south” on their chests, while the person in the middle had the word “east.”

In August 2023, armed clashes broke out in Tripoli between the 444th Combat Brigade and the Special Deterrence Force. These are two of the militias vying for power since the 2011 overthrow of Muammar Gaddafi. In their submission to the Board, the user stated that they posted the video to clarify that Libya was “one people” with “one army” supporting the north-eastern city of Derna after the flooding that resulted from dam collapses following Storm Daniel in September 2023.

Meta originally removed the post from Facebook, citing its Dangerous Organizations and Individuals policy.

After the Board brought this case to Meta’s attention, the company determined that its removal was incorrect and restored the content to Facebook. The company told the Board that the content did not contain any references to a designated organization or individual and therefore did not violate Meta’s policies.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights the over-enforcement of Meta's Dangerous Organizations and Individuals policy, including through automated systems, which can negatively impact users’ freedom of expression when they share commentary about current events on Meta’s platforms.

In the case of Öcalan's Isolation, the Board recommended that Meta “evaluate automated moderation processes for enforcement of the Dangerous Organizations and Individuals policy” (recommendation no. 2). Meta reported that it would take no action on this recommendation because "the policy guidance in this case does not directly contribute to the performance of automated enforcement."

In terms of automation, the Board has urged Meta to implement an internal audit procedure to continually analyze a statistically representative sample of automated removal decisions in order to reverse and learn from enforcement mistakes (Breast Cancer Symptoms and Nudity, recommendation no. 5). Meta has reported implementing this recommendation but has not published information demonstrating full implementation. As of Q4 2022, Meta reported having "completed the global roll out of new, more specific messaging that lets people know whether automation or human review led to the removal of their content from Facebook," but did not provide evidence of this.

In the same decision, the Board also recommended that Meta "expand transparency reporting to disclose data on the number of automated removal decisions per Community Standard, and the proportion of those decisions subsequently reversed following human review" (Breast Cancer Symptoms and Nudity, recommendation no. 6). As of Q3 2023, Meta reported that it was establishing a consistent accounting methodology for such metrics. In the case of Punjabi Concern Over the RSS in India, the Board urged Meta to "improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard" (recommendation no. 3). As of Q3 2023, Meta reported that it was working to define its accuracy metrics, alongside its work on recommendation no. 6 in Breast Cancer Symptoms and Nudity.

The Board reiterates that full implementation of its recommendations will help to decrease enforcement errors under the Dangerous Organizations and Individuals policy, reducing the number of users whose freedom of expression is infringed by wrongful removals.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention. The Board emphasizes that full adoption of these recommendations, along with published information demonstrating successful implementation, could reduce the number of enforcement errors under the Dangerous Organizations and Individuals policy on Meta's platforms.
