Overturned

Journalistic Video on Somalia’s Future

A user appealed Meta’s decision to remove a video featuring a clip of a news report about an international conference on Somalia.

Type of Decision

Summary

Policies and Topics

Topic
Journalism
Community Standard
Dangerous Organizations and Individuals

Region/Countries

Location
Somalia, United Kingdom

Platform

Platform
Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to remove a video featuring a clip of a news report about an international conference on Somalia. The clip shows a journalist asking people in Somalia questions and includes footage of leaders and flags of the armed group al-Shabaab. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

About the Case

In May 2025, a Facebook user who is a verified journalist posted a video featuring a news report clip along with the question, “Who gets to decide Somalia’s future?” The caption also included references to the 2012 London Conference on Somalia. Both the video and the caption are in English. The original video aired in 2013 on Channel 4, a British public-service television channel.

In the video, the journalist discusses the controversies surrounding the conference on Somalia, a major international meeting convened by the government of the United Kingdom to discuss the African country’s future. The caption notes that the conference excluded al-Shabaab, a group that had been working to overthrow the government in Somalia and that “controlled most of the country” at the time. The journalist highlights the metaphorical distance between Western leaders and people in Somalia, pointing out that many locals, when shown pictures of key UK leaders discussing the country’s future, did not recognize them. The video also contains footage of al-Shabaab leaders and flags.

Harakat al-Shabaab al-Mujahideen, commonly known as al-Shabaab, is an Islamist insurgent group with links to al-Qaeda. The group mainly operates in Somalia and has carried out several attacks in neighboring countries. Al-Shabaab has been designated as a dangerous organization under Meta's Dangerous Organizations and Individuals policy.

Under its Dangerous Organizations and Individuals policy, the company removes “glorification, support and representation” of individuals or organizations that “proclaim a violent mission or are engaged in violence.” However, the policy allows for “content reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities” in the context of “social and political discourse.” According to the policy, “news reporting includes information that is shared to raise awareness about local and global events in which designated dangerous organizations and individuals are involved.”

In his appeal to the Board, the user stated that he reports on “important issues to inform the public” and that this report had been aired on a TV channel following national guidelines. The user highlighted that “journalism is not a crime.”

After the Board brought this case to Meta’s attention, the company reversed its original decision. Meta determined that the content is a news report questioning the UK government’s exclusion of al-Shabaab from a conference on the future of Somalia, given that al-Shabaab controlled large parts of the country at the time. Meta also noted that the footage of al-Shabaab's leaders and flags “is presented with appropriate editorial intervention and does not contain any glorification or support” of the group. Meta concluded that its initial removal was incorrect, as the post qualifies for the Dangerous Organizations and Individuals policy’s exception for social and political discourse. The company then restored the content to Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case highlights the continued overenforcement of Meta's Dangerous Organizations and Individuals Community Standard, in this instance against content that was posted by a verified journalist, aired on a mainstream broadcast service and falls under the policy exception for social and political discourse. Enforcement errors of this kind affect not only the posting user’s freedom of expression but also other users’ access to information, which is especially critical in countries in conflict.

The Board has previously issued a decision involving two posts that referenced al-Shabaab. In both cases, Meta erred by removing content under its Dangerous Organizations and Individuals policy even though the first post clearly reported on, and the second clearly condemned, a designated entity. The Board has also issued several recommendations regarding the enforcement of Meta's Dangerous Organizations and Individuals policy. These include, among others, a recommendation that the company “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Individuals and Organizations policy in order to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). In November 2023, Meta reported implementing an update to the Dangerous Organizations and Individuals policy, which included details about how the company “approaches news reporting as well as neutral and condemning discussion.” Meta also reported “tracking the accuracy of enforcement of the social and political discourse carveout” (Meta’s Q3 2023 Quarterly Update on the Oversight Board). However, Meta did not share results with the Board. The Board therefore considers that Meta has not published information demonstrating implementation of this recommendation.

Furthermore, in one of its policy advisory opinions, the Board recommended that Meta “explain the methods it uses to assess the accuracy of human review and the performance of automated systems in the enforcement of its Dangerous Organizations and Individuals policy” (Referring to Designated Individuals as Shaheed, recommendation no. 6). The Board considers that Meta reframed this recommendation. The company stated that it conducts audits to assess the accuracy of its content moderation decisions and that these audits inform areas for improvement. However, Meta did not explain the methods it uses to perform these assessments, which would further improve the transparency of its enforcement, including around differences between markets and languages, and the company declined to make these methods public.

The Board also issued a recommendation focused on automated enforcement, asking Meta to “implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity, recommendation no. 5). The Board considers that Meta did not address this recommendation, given that the company bundled it with another recommendation without engaging with the internal audit procedure outlined by the Board.

The repeated overenforcement of the company’s Dangerous Organizations and Individuals policy undermines the ability of users to post and share news reporting and information about designated organizations, infringing on users' freedom of expression. The Board’s cases suggest that errors of this sort are all too frequent. In this case, the Board also considered that the video came from a verified journalist and had aired on a mainstream United Kingdom broadcasting service in 2013. Meta should make reducing such errors a high priority. Full implementation of the recommendations mentioned above could reduce the number of incorrect removals under the company’s Dangerous Organizations and Individuals policy.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
