Multiple Case Decision
Press Release by the Oromo Liberation Front
February 24, 2026
2 cases included in this bundle
FB-KW5ZYLJ8 — Case about dangerous individuals and organizations on Facebook
FB-A41GXKK9 — Case about dangerous individuals and organizations on Facebook
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
Two users separately appealed Meta's decisions to remove their posts, both of which shared a press release from the Oromo Liberation Front (OLF), a political party advocating for the self-determination of the Oromo people in Ethiopia. After the Board brought the appeals to Meta's attention, the company reversed its decisions and restored both posts.
About the Cases
The Board reviewed two pieces of content sharing a press release from the OLF. In both cases, different Facebook users shared the press release with a caption in the Oromo language stating that, in June 2025, the OLF's head office in Addis Ababa, Ethiopia's capital, had been officially returned to the party. In the press release, the OLF described the event as the result of prolonged efforts and expressed gratitude to all institutions and members who supported the process.
The OLF was established in 1973 as a political organization advocating for the self-determination of the Oromo people, Ethiopia's largest ethnic group. Historically, the OLF engaged in armed struggle against the Ethiopian government, which designated it as a terrorist group. In 2018, however, the Ethiopian government removed the OLF from its terrorist list, allowing its leaders to return from exile. A splinter faction, which had served as the armed wing of the OLF, refused to disarm and continued its armed resistance against the federal government. In 2021, Ethiopian authorities designated the faction as a terrorist organization.
Under its Dangerous Organizations and Individuals policy, Meta removes “glorification, support and representation” of individuals or organizations that “proclaim a violent mission or are engaged in violence.”
In their appeals to the Board, the users stated that the posts shared a press release from the OLF regarding the re-opening of its headquarters in Addis Ababa and did not violate Meta's policies. They emphasized that the OLF is a legally registered political party recognized by the National Electoral Board of Ethiopia.
After the Board brought these cases to Meta's attention, the company determined that the posts refer to the Oromo Liberation Front, distinct from a similarly named organization designated as a Tier 2 violent non-state actor under the Dangerous Organizations and Individuals policy. According to Meta, since the posts did not pertain to a designated entity, they did not violate the policy. Therefore, Meta’s original decisions to remove the posts were incorrect. The company then restored both pieces of content to Facebook.
Board Authority and Scope
The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Cases
These cases highlight the continued careless enforcement of Meta's Dangerous Organizations and Individuals Community Standard, even against content that does not reference a designated entity. Such enforcement errors, which infringe on users' freedom of expression, are particularly concerning in a country like Ethiopia, where there is ongoing conflict and freedom of expression is restricted.
The Board has previously issued several recommendations on this topic. Among them, the Board recommended that Meta “explain the methods [the company] uses to assess the accuracy of human review and the performance of automated systems in the enforcement of its Dangerous Organizations and Individuals policy” (Referring to Designated Individuals as Shaheed, recommendation no. 6). The Board considers this recommendation to have been reframed by Meta. The company stated that it conducts audits to assess the accuracy of its content moderation decisions and that these audits inform areas for improvement. Meta, however, neither explained nor made public the methods it uses to make these assessments. Doing so would further improve the company’s enforcement transparency, including when those methods are applied across different regions, markets and languages.
Furthermore, to improve overall transparency around the designation and de-designation of entities, the Board has recommended: “Meta should explain in more detail the procedure by which entities and events are designated. It should also publish aggregated information on its designation list on a regular basis, including the total number of entities within each tier of its list, as well as how many were added and removed from each tier in the past year” (Referring to Designated Individuals as Shaheed, recommendation no. 4). Meta has partially implemented this recommendation. In November 2024, Meta published information on how and why the company designates organizations or individuals and how and when it will consider their removal from the Dangerous Organizations and Individuals list. Meta has not, however, published data on the total number of entities within each tier on the list, or how many were removed from the list.
The Board also issued a recommendation focused on automated enforcement, asking Meta to “implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity, recommendation no. 5). The Board considers that Meta did not address this recommendation, given that the company bundled it with recommendation no. 1 from the same case, which asked Meta to “improve the automated detection of images with text-overlay to ensure that posts raising awareness of breast cancer symptoms are not wrongly flagged for review.” While Meta reported improvements in the automated detection of images, the company did not engage with the internal audit procedure outlined by the Board in recommendation no. 5 (Meta Q2 + Q3 2021 Update on the Oversight Board, p. 32).
The repeated overenforcement of the company’s Dangerous Organizations and Individuals policy, as highlighted in previous summary decisions (e.g., Link to Wikipedia Article on Hayat Tahrir al-Sham, Mention of Al-Shabaab, Anti-Colonial Leader Amílcar Cabral), undermines the ability of users to share news on Meta’s platforms. The Board believes that full implementation of these recommendations would strengthen the company’s ability to improve enforcement accuracy and better protect user speech. In addition, sharing more information on the designation of entities would increase overall transparency and create opportunities for engagement with Meta that could lead to further improvements.
Decision
The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s correction of its initial errors once the Board brought the cases to Meta’s attention.