Overturned

Revolutionary Armed Forces of Colombia (FARC) Dissidents Video

Type of Decision: Summary
Policies and Topics: Violence, War and conflict
Community Standard: Dangerous Organizations and Individuals
Location: Colombia
Platform: Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to leave up a video posted on Facebook that depicts Estado Mayor Central (EMC), a conglomerate of dissident factions of the Revolutionary Armed Forces of Colombia (FARC, after the Spanish acronym). After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the post.

About the Case

In September 2024, a Facebook user posted a video depicting Estado Mayor Central (EMC), a conglomerate of dissident factions of the Revolutionary Armed Forces of Colombia (FARC), a rebel group that fought the Colombian government from 1964 to 2016. The video contains footage of military training and active military operations, with a text overlay referencing killings attributed to the group and an overlay image displaying FARC's logo. After a 2016 peace deal with the Colombian government, FARC re-formed as a legal political party. Despite this, factions that broke with the party's new political leadership, including those that make up EMC, continue to engage in violence, including fighting the government.

Under its Dangerous Organizations and Individuals policy, Meta removes content that glorifies, supports, represents or positively references dangerous organizations that “proclaim a violent mission or are engaged in violence.” The policy allows “neutral discussions,” such as “factual statements, commentary, questions, and other information that do not express positive judgment around the designated dangerous organization.” In such instances, the company requires a clear indication of the user’s intent and defaults to removing content when that intent is ambiguous or unclear.

After the Board brought this case to Meta’s attention, the company found that the video appears to be official propaganda for FARC dissident factions that rejected the peace process, as it displays the logo of factions that continue to engage in violent activities. Dissident groups from FARC are designated under Meta’s Dangerous Organizations and Individuals policy, and the depiction of combatants training and carrying the wounded suggests the group itself produced the imagery. Sharing propaganda materials produced by designated groups, outside of an allowable context such as “social and political discourse,” which includes users “reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities,” can be understood as support for FARC dissident factions. The company therefore removed the content from Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user who reported content that was then left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case highlights an instance of Meta underenforcing its Dangerous Organizations and Individuals policy, specifically regarding a video promoting FARC dissidents engaged in violence in Colombia. In the Sudan’s Rapid Support Forces Video Captive decision, the Board expressed a similar concern about Meta’s automated detection failing to flag content associated with the Rapid Support Forces (RSF), an entity not allowed to have a presence on the company’s platforms. In that case, the Board issued a recommendation on the accuracy of enforcement of the Dangerous Organizations and Individuals policy, stating that, “to enhance its automated detection and prioritization of content potentially violating the Dangerous Organizations and Individuals policy for human review, Meta should audit the training data used in its video content understanding classifier to evaluate whether it has sufficiently diverse examples of content supporting designated organizations in the context of armed conflicts, including different languages, dialects, regions and conflicts” (recommendation no. 2). Meta has reported progress on this recommendation.

Separately, in Referring to Designated Dangerous Individuals as “Shaheed,” the Board recommended that, “to improve the transparency of its designated entities and events list, Meta should explain in more detail the procedure by which entities and events are designated. It should also publish aggregated information on its designation list on a regular basis, including the total number of entities within each tier of its list, as well as how many were added and removed from each tier in the past year” (recommendation no. 4). The company stated it is working to update its Transparency Center to provide a more detailed explanation of its process for designating and de-designating entities and events.

The Board believes that full implementation of both recommendations would, respectively, reduce the number of enforcement errors and give users greater clarity about when their content potentially violates the Dangerous Organizations and Individuals policy.

Decision

The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
