Case Description
In January 2025, four posts in Somali were published on a Facebook page, discussing Somaliland politics. Somaliland self-declared its independence from Somalia in 1991. No country has recognized its statehood.
The Facebook page describes itself as a freelance journalism page and has about 90,000 followers. It is not part of Meta’s cross-check program, which aims to prevent enforcement errors and includes additional levels of review for certain entities, including journalistic and civic ones. The four posts describe and discuss recent socio-political events concerning Somaliland.
Two of the posts are about Somaliland President Abdirahman Mohamed Abdillahi’s recent foreign policy engagements. The posts include photos of a foreign trip with captions stating that media coverage was prohibited. Two other posts relate to a public, official ceremony in Somaliland and a political conference, with descriptive captions.
Two users reported the page under the Dangerous Organizations and Individuals and Hateful Conduct policies, and it was queued for review. When a page is queued for review, Meta evaluates its key elements, such as its name, bio details, cover photo and posts. A human reviewer found the page violated the Hateful Conduct policy and it was “unpublished” (a measure similar to account deactivation). None of the four posts had been reported, but each was removed for individually violating the Hateful Conduct policy.
After the Board selected the cases, the company reviewed its initial decision to unpublish this page and also determined that it incorrectly removed all four posts. Consequently, Meta restored all posts, re-published the page and reversed the strike against the posting user’s account and page.
In their appeal to the Board, the posting user stated that their intention was to share information, not to attack or discriminate against any individual or group and that their posts did not violate the Hateful Conduct policy.
The Board selected these enforcement errors to examine the impacts of Meta’s moderation on media freedom in the Horn of Africa, in the context of its approach to the governance of pages. These cases fall within the Board’s Elections and Civic Space priority.
The Board would appreciate public comments that address:
- Media freedom and safety of journalists in Somaliland, the role of social media and the situation for freedom of expression.
- Challenges in preventing wrongful enforcement against journalistic content, pages and accounts, especially in non-English speaking regions where freedom of expression is heavily restricted.
- Good practices for ensuring access to adequate remedies for journalists and media organizations locked out of pages or accounts as a result of wrongful enforcement.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23.59 Pacific Standard Time (PST) on Tuesday 15 July.
What’s Next
Over the next few weeks, Board Members will be deliberating these cases. Once they have reached their decision, we will post it on the Decisions page.
Comments
1) In both Somalia and Somaliland, journalists operate under significant pressure. They face frequent detentions, arrests, media raids, and restrictions on their online platforms, especially when covering sensitive political or security incidents. The Somali Journalists Syndicate (SJS) 2022 Annual Report documented that authorities imposed severe restrictions on access to information, detained journalists, suspended media outlets, and enforced internet blackouts to suppress reporting—particularly around elections and public security. These restrictions have created an increasingly hostile environment for the press, where independent journalism is under threat and freedom of expression is systematically undermined. There is additional pressure from media owners, who often receive bribes from politicians in exchange for favorable reporting. This has compromised many journalists, some of whom have left mainstream media in both Somaliland and Somalia and turned to their own social media platforms to continue reporting.
2) Facebook is the major platform for news and information sharing in Somaliland and Somalia, enabling direct engagement with audiences. It is arguably the single largest platform for journalists and local communities: if something happens in Somalia or Somaliland, you will first hear about it on Facebook. However, content posted in the Somali language is often misinterpreted by Facebook's automated moderation systems, which lack understanding of the local context. SJS has documented how this leads to wrongful takedowns of pages and accounts, restrictions, and sometimes account lockouts, which further stifle independent voices in regions that already limit freedom of expression.
3) Journalists in Somalia and Somaliland face growing difficulties when using social media platforms like Facebook, not just from state repression but also from misapplication of platform rules that disproportionately target legitimate reporting. First, Facebook’s content moderation systems operate largely in English, and automated filters often misclassify Somali-language posts as violations (this includes local dialects such as Maay, which some media use for their reporting). As a result, legitimate journalism—including human rights reporting, coverage of security incidents, and coverage of political developments—is wrongly taken down due to a lack of local language understanding and contextual awareness.
A particularly alarming trend occurred in November 2024, when a large group of Somali journalists and non-journalistic users were suddenly locked out of their accounts after coordinated reports falsely claimed they were deceased. Facebook’s system automatically “memorialized” these accounts—labeling them as belonging to people who had "died"—despite the users being alive and active. This tactic was used to silence journalists and block reporting without any actual violation of platform rules.
Another major challenge is the misapplication of Facebook’s “Dangerous Organizations and Individuals” (DOI) policy. This rule, intended to block extremist content, has in practice been used against journalists, especially when they report on or interview individuals from conflict areas. In 2023, an interview I gave was removed under the DOI policy—despite being a legitimate piece of reporting in which I discussed incidents that happened in Somalia. This example shows how vague interpretations of DOI rules can result in wrongful censorship of critical journalism.
Journalists frequently face sudden takedowns of posts, videos, or entire accounts—with no clear explanation. Appeals are routed through automated forms, and responses are often generic or dismissive. Many users never receive feedback or a resolution, reinforcing the sense that enforcement lacks fairness, transparency, and accountability.
Due to slow and often ineffective appeal mechanisms, many journalists are left without access to their content or audiences for extended periods. This leads to self-censorship, fear of further platform restrictions, and frustration among media workers who rely on Facebook to reach communities and international audiences.
5) To address the systemic challenges to media freedom on digital platforms such as Meta's, several key practices should be adopted. First, social media companies must invest in localized, language-specific, and context-aware content moderation for the Somali language. Automated moderation systems often lack the nuance to distinguish between harmful content and legitimate journalism, especially in politically sensitive or conflict-prone regions. Hiring or consulting with local and neutral experts can ensure decisions are better informed by real-world context.
Second, social media companies should establish dedicated appeals mechanisms for journalists and media outlets. These processes must be fast, transparent, and accessible—particularly for journalists and human rights groups, who are often penalized due to vague enforcement. A clear and fair appeals system can serve as a crucial safeguard against wrongful content removals and account suspensions.
In addition, collaboration with press freedom organizations can help platforms identify recurring enforcement issues, develop protective protocols for journalists, and respond effectively to abuse. These partnerships also build local trust and allow civil society voices to shape digital policy.
Platforms such as Meta should also commit to publishing transparency reports that include region-specific data, particularly on content removals affecting media professionals, and disclose whether governments have requested any takedowns. These reports can help build accountability and offer insight into the scale and patterns of wrongful enforcement, especially in high-risk countries like Somalia.
Finally, I think there is a need to expand digital rights and safety training for journalists and women's groups. These programs should include guidance on navigating platform rules, protecting digital assets, and preparing for potential takedowns. Strengthening digital literacy and resilience among journalists will ensure they can continue their work safely, even in the face of online censorship or targeted attacks.
Thank you!