Multiple Case Decision

Reporting on Somaliland Current Affairs

The Oversight Board has found that Meta’s systems have failed to safeguard independent journalism and public interest reporting in the self-declared Republic of Somaliland.

4 cases included in this bundle

Overturned

FB-79J73LS1

Case concerning hate speech on Facebook

Platform
Facebook
Topic
Freedom of expression, Journalism
Standard
Hateful Conduct
Location
Somalia
Date
Published on October 30, 2025
Overturned

FB-G8P83WBH

Case concerning hate speech on Facebook

Platform
Facebook
Topic
Freedom of expression, Journalism
Standard
Hateful Conduct
Location
Somalia
Date
Published on October 30, 2025
Overturned

FB-ETWR07NV

Case concerning hate speech on Facebook

Platform
Facebook
Topic
Freedom of expression, Journalism
Standard
Hateful Conduct
Location
Somalia
Date
Published on October 30, 2025
Overturned

FB-F91P3YE6

Case concerning hate speech on Facebook

Platform
Facebook
Topic
Freedom of expression, Journalism
Standard
Hateful Conduct
Location
Somalia
Date
Published on October 30, 2025

Summary

The Oversight Board has found that Meta’s systems have failed to safeguard independent journalism and public interest reporting in the self-declared Republic of Somaliland. The Board analyzed the removal of a Facebook page and four pieces of content covering current affairs in Somaliland, a region where journalists face repression. The Board has overturned Meta’s decisions to take down the page and the four posts. Meta should improve its mistake prevention systems and appeal processes to ensure that journalists’ pages and their content are not wrongly removed.

About the Cases

The four cases the Board considered as part of this decision relate to a Facebook page that discusses news and events in Somaliland. The page describes itself as engaging in freelance journalism and has about 90,000 followers.

In January 2025, four posts were published on the page. Two posts are about Somaliland President Abdirahman Mohamed Abdullahi’s recent foreign policy trips and include photos with captions stating that media coverage was prohibited. Two other posts relate to an official ceremony and a political conference in Somaliland, also with descriptive captions. The page, posts and captions were all in the Somali language.

After users reported the page, a human reviewer found that it violated Meta’s Hateful Conduct policy, and the page was “unpublished,” i.e., removed. The reviewer also removed the four posts for violating the same policy. The page administrator’s account received a strike.

The page administrator appealed Meta’s decision to remove the page and, separately, the removal of the four posts. The appeals covering the four posts were reviewed by six human reviewers, including the reviewer who made the original decisions, and the decisions were upheld. Meta’s systems did not prioritize the page decision for review and the appeal was automatically closed, with the page remaining unpublished. The page administrator then appealed to the Board.

When the Board selected the cases for review, Meta reversed all of its initial decisions, reinstating the page and four posts, and removing the strike. The Board identified 10 more appeals from Somaliland relating to content removals, which Meta confirmed were in error and reinstated.

Though considered by the Federal Republic of Somalia to be a constituent province, Somaliland declared its independence in 1991 but has not gained international recognition. According to media freedom organizations, Somalia, including Somaliland, is among the most dangerous countries in the world for journalists, and the Somaliland authorities are repressive toward local media and put enormous pressure on them.

Key Findings

The Board finds that there was no justification under any of Meta’s content policies to take down the Facebook page and posts. No elements of the page were violating and the removal was entirely arbitrary. Removing the content was inconsistent with Meta’s human rights responsibilities.

Digital platforms, such as Facebook, are essential spaces for independent journalists in Somaliland to disseminate news and connect with domestic and international audiences. Arbitrary removal of content has serious detrimental effects on freedom of expression in the region and inadvertently contributes to the hostile environment for journalists. Unpublishing pages can be highly consequential, particularly for journalists, and such decisions should receive further review before enforcement.

The cross-check system, consisting of General Secondary Review (GSR) and Sensitive Entity Secondary Review (SSR), should have flagged these cases for additional review, but it did not give sufficient weight to the fact that the page was engaged in freelance reporting. Two other mistake prevention systems should have been activated but were either not used fully or not used at all.

Meta’s failure to include the page in its cross-check system points to a potentially broader systemic problem. The Board is particularly concerned that the system to prevent overenforcement did not prioritize public interest journalists working in the Somali language. This is especially relevant following Meta’s announcement in January that its new approach to content moderation should deliver “more speech and fewer mistakes.” Meta’s Journalist Registration Program, which provides enhanced security protections and could have helped in this case, does not cover Somalia, including Somaliland.

Meta lacks a single, centralized resource documenting its content policies covering pages, which raises clarity and transparency issues.

The Oversight Board’s Decision

The Board overturns Meta’s decisions to unpublish the page and remove the four pieces of content.

The Board also recommends that Meta:

  • Consolidate the rules and enforcement guidelines covering pages into a comprehensive and easily accessible resource in the Transparency Center.
  • Prohibit human reviewers who make an enforcement decision from reviewing any appeal on that decision.
  • Update its GSR ranking system to explicitly prioritize reviews of unpublication of pages.
  • Develop new criteria and systems to proactively enroll pages or accounts engaged in journalism in regions where media freedom is repressed, based on authoritative sources such as the Committee to Protect Journalists’ impunity index.

*Case summaries provide an overview of cases and do not have precedential value.

Full Case Decision

  1. Case Description and Background

In January 2025, four posts in Somali were published on a Facebook page, discussing recent socio-political events concerning the self-proclaimed Republic of Somaliland. The Facebook page describes itself as engaging in freelance journalism and has about 90,000 followers. Facebook pages are created and managed by users through their accounts, alongside their personal profiles, and allow people and businesses to connect with broader audiences.

Two of the posts are about Somaliland President Abdirahman Mohamed Abdullahi’s recent foreign policy engagements. The posts include photos of a foreign trip taken by President Abdullahi with captions, in Somali, stating that the Somaliland authorities had prohibited media coverage about the trip. Two other posts relate to a public, official ceremony in Somaliland and a political conference, with descriptive captions.

Two users reported the page for violating the Dangerous Organizations and Individuals and Hateful Conduct policies. A human reviewer found the page violated the Hateful Conduct policy and it was “unpublished” (meaning the page was removed, a measure similar to account deactivation). None of the four posts were reported, but the same human reviewer removed each of them for violating the Hateful Conduct policy. The content creator’s account managing the page received a strike.

The content creator appealed Meta’s decision to unpublish the page and, separately, the four content removal decisions. Their appeals against the four content decisions were reviewed by six human reviewers, resulting in the initial decisions being upheld. The appeal against the decision to unpublish the page was not prioritized for human review and, as a result, the page remained unpublished.

After the Board selected these cases, Meta reversed its initial decisions to unpublish the page and remove the four posts, reinstating them all and reversing the strike against the content creator’s account.

In addition to the four posts selected in this case, the Board identified 10 other appeals from Somaliland contesting content removals that Meta later confirmed were errors and reinstated.

The Board notes the following context in reaching its decision:

The self-proclaimed Republic of Somaliland, though considered by the Federal Republic of Somalia to be a constituent province, declared its independence from Somalia in 1991. Despite multiple efforts, Somaliland has not gained international recognition. Local and regional tensions, including unresolved border disputes, periodic clan-based conflicts, and security threats from terrorist groups such as Al-Shabaab, persist.

Notwithstanding relative stability in Somaliland, press freedom is a serious concern, as in Somalia more broadly. The Committee to Protect Journalists (CPJ) ranked Somalia third in its 2024 Global Impunity Index (an annual ranking of impunity for the murder of journalists per capita), and first every year from 2015 to 2022, calling the country one of the “worst offenders on the index” (CPJ does not report separately on Somaliland). According to media freedom organization Reporters Without Borders (RSF), journalists working in Somalia, including Somaliland, are at risk of arrest, detention and even death. Since 2010, more than 50 journalists have been killed, making Somalia one of the most dangerous countries in the world for journalists. RSF reports that authorities in Somaliland are particularly repressive and put enormous pressure on the local media. Similarly, Freedom House reports that journalists and other public figures face pressure from authorities in Somaliland and, while there are media outlets across print, TV and online, many are politically affiliated.

Reporting on government corruption, foreign policy and a region’s political status is highly sensitive and routinely results in retaliation. In 2022, after critical coverage of an international engagement of then-President of Somaliland Muse Bihi Abdi, intelligence agents allegedly assaulted three freelance journalists, and another local journalist with a substantial Facebook following was arrested. In 2023, 14 journalists were arrested for covering opposition protests. Often, alleged affiliation with opposition groups results in detention. In January 2024, the arrest and reported torture of MM Somali TV staff further reflected the normalization of abuse against those undertaking critical coverage of the government. The effects of repression against those who contradict official narratives are compounded by terrorist threats to journalists’ physical safety (see public comment by Somali Journalists Syndicate, PC-31295). In this context, journalists in Somaliland have coined the phrase “convicted as charged” in reference to the now-frequent practice of prosecuting media professionals.

Beyond physical attacks and arbitrary arrests, institutional censorship systematically undermines journalistic independence. The Somaliland government’s ban on BBC broadcasts in 2022 demonstrates a rejection of international media scrutiny and a severe limitation on access to information for people in Somaliland. Similarly, the December 2023 prohibition of a Universal TV debate – citing “immorality” and cultural and Islamic values – reflects a growing tendency to conflate political control with cultural and religious guardianship, leaving little space for public discourse.

In this hostile environment for traditional media, digital platforms have emerged as essential spaces for the dissemination of news, critical engagement and preservation of editorial autonomy. Social media offers an indispensable avenue for journalists and other commentators to bypass state-imposed restrictions and connect with both domestic and international audiences. Facebook plays a critical role for news and information sharing, with the Somali Journalists Syndicate describing it as the “single largest platform for journalists and local communities” in Somaliland and Somalia, allowing journalists to avoid institutional censorship and connect with their audiences directly (see PC-31295). Data for July 2025 shows that about six out of 10 social media visits in Somalia, including Somaliland, go to Facebook.

2. User Submissions

The user who appealed Meta’s decisions to the Board explained that the intention across all four posts was to share information, not to attack or discriminate against any individual or group, and that the posts did not violate the Hateful Conduct policy.

3. Meta’s Content Policies and Submissions

I. Meta’s Content Policies

Hateful Conduct

Meta defines “hateful conduct” as “direct attacks against people” on the basis of protected characteristics, but the company has now acknowledged that this policy was not engaged in these cases.

Pages

Pages are distinct from profiles and groups and are designed to offer a separate presence on Facebook to connect with broader audiences. Unlike a personal account, which is associated with an individual, a page is created and managed through a personal account but serves as a separate profile with professional tools for monetization and advertising.

Meta’s Transparency Center states that pages have to be managed by authorized representatives and comply with Meta’s broader Community Standards, including restrictions on prohibited content and misuse. Pages can be removed if their name, description, cover image or content created by their administrators violate the Community Standards. Page administrators can be held responsible not only for their own posts on the page, but also for approving violating content posted by other people on the page. Pages that repeatedly violate policies may be removed from recommendations, have their distribution reduced, lose access to monetization features or be unpublished.

Facebook’s Help Center explains that pages may be unpublished or deactivated if they “post spam [or] content that may mislead people,” act against Meta’s Terms of Service or breach its Hateful Conduct or Ads policies. If a page is found to be “deceptively” generating likes, Meta can also impose limits on it, such as disabling the like button. The same Help Center section notes that users can submit separate appeals against decisions to unpublish or restrict pages through a dedicated process available on the page.

II. Meta’s Submissions

Meta unpublished the page and removed the four individual posts for violating the Hateful Conduct policy.

After the Board selected the cases, the company reversed these initial decisions citing human error, restored all posts, re-published the page and reversed the strike against the user’s account. Meta acknowledged that none of the posts contained attacks on individuals based on protected characteristics and that they should not have been removed. At the Board’s request, Meta investigated why this happened. The company was not able to provide the Board with more information about the reasons for the initial decisions. For the eight reviews on appeal, Meta found the errors were caused by the human reviewers’ “lack of concentration due to fatigue” and by tooling issues that automatically actioned tasks when moderators stepped away and left the review job open. The company noted it has since conducted coaching sessions and plans to implement additional training to improve reviewers’ focus, refresh policy comprehension and address “managing bias.”

Page Review

A review of a page can be initiated following reports from users, external partners who have dedicated channels for reporting potential violations, or through automated detection. Human moderators review the content posted through the page alongside the main elements of the page itself (i.e. bio, title, description, cover or page photo).

Meta uses distinct, confidential criteria for applying strikes to pages and does not disclose these publicly. Multiple content removals in a short period of time generally result in a single strike against the page.

A page may also be unpublished when its purpose or other page elements violate the Community Standards. These include violations in the name, description or cover photo, or when page administrators create content, such as posts, comments or rooms that violate the Community Standards. The decision to unpublish a page can be made by a single reviewer.

In this case, as Meta enforced against all four posts at the same time, this would have resulted in a single strike against the page. Page unpublication would therefore require a violating page element. After reassessing its original decision, Meta noted that there were no violating elements.

Appeals

Users can appeal page unpublication separately from appealing the removal of content posted on the page, though these appeals are routed through different market-based queues. In both queues, appeals are prioritized based on similar criteria, such as virality, severity and likelihood of violation.

Human reviewers can be involved in reviewing multiple tasks in relation to the same review job, both for initial reviews and on appeal. In this case, the Somali market reviewer who removed the posts and unpublished the page was also assigned to review an appeal against one of their own content removal decisions. The appeals of the four original content decisions were each reviewed twice, by six human moderators, including the initial moderator. The user also appealed the decision to unpublish the page but it was never prioritized for review.

Mistake Prevention Systems

Cross-Check

Meta’s cross-check system is part of its mistake-prevention strategy designed to provide additional layers of review for potentially violating content, including on pages where mistakes can have especially negative impacts on Meta’s business partners. Cross-check consists of two components: General Secondary Review (GSR), which can be applied to any content based on automated prioritization, and Sensitive Entity Secondary Review (SSR), which applies to the review of posts from entities that Meta includes on its internal lists. Content is prioritized for review based on factors like risks of overenforcement, topic and entity sensitivity and the severity of potential violations or enforcement actions.
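
Purely by way of illustration, the following minimal Python sketch shows how a prioritization score of the kind described above might combine factors such as over-enforcement risk, entity sensitivity and the severity of the enforcement action. All names, weights and values are hypothetical assumptions for exposition, not Meta’s actual implementation.

```python
# Illustrative sketch only: a toy prioritization score combining the kinds of
# factors the decision describes (over-enforcement risk, entity sensitivity,
# action severity). All names and weights are hypothetical, not Meta's.
from dataclasses import dataclass

@dataclass
class ReviewCandidate:
    overenforcement_risk: float  # 0.0-1.0: likelihood the action is a false positive
    entity_sensitivity: float    # 0.0-1.0: e.g., a journalist in a repressive market
    action_severity: float       # 0.0-1.0: e.g., page unpublication > single post removal

def priority_score(c: ReviewCandidate) -> float:
    """Higher scores are routed to additional (secondary) human review first."""
    return 0.4 * c.overenforcement_risk + 0.3 * c.entity_sensitivity + 0.3 * c.action_severity

# A page unpublication affecting a high-sensitivity entity should outrank a routine removal.
page_unpublish = ReviewCandidate(overenforcement_risk=0.7, entity_sensitivity=0.9, action_severity=1.0)
routine_post = ReviewCandidate(overenforcement_risk=0.2, entity_sensitivity=0.1, action_severity=0.3)
assert priority_score(page_unpublish) > priority_score(routine_post)
```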

Pages may be included in SSR based on criteria such as: a) the type of user or entity, including civic and government entities (e.g., elected officials and human rights organizations), “media organizations, businesses, communities and creators, including advertisers,” b) entities historically exposed to over-enforcement, c) significant world events and d) legal and regulatory requirements (see Policy advisory opinion on Meta’s cross-check program).

Entities, including journalists, are often added to SSR lists when they have been subject to overenforcement or have come to Meta’s attention due to their involvement in significant world events, like reporting on crises and conflicts. Follower count is one of the criteria Meta uses to determine when to include pages in SSR. The page in this case was not part of the SSR program at the time of posting. Meta declined to provide the Board with an overview of East African media entities and journalists included in SSR, citing operational constraints and concerns the data could be misinterpreted.

Decisions to remove content posted by pages and decisions to unpublish pages or disable profiles are eligible for cross-check, including GSR. In these cases, Meta suggested its ranking systems determined the page and the four posts warranted initial human review, but Meta was unable to provide information on why higher levels of review of the page (e.g. through additional scaled review or escalation) did not occur under GSR.

Other Mistake Prevention Systems

Dynamic Multi Review (DMR) is a mistake prevention system that enables the company to send reviewed cases back for additional review so that a majority of consistent decisions is reached across different reviewers before enforcement action is taken. DMR is activated based on criteria such as violation severity and human review capacity. It was not enabled for the initial reviews or for the page review, but was activated for the appeal queue for the content decisions only, resulting in each of the four content appeals receiving an additional review. However, in this case it is unclear whether all review jobs were completed, or whether some were automatically actioned due to a tooling problem. As a result, the initial content decisions were upheld without a third review taking place.
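
As a minimal sketch of the majority-consistency idea behind DMR described above, the toy function below returns an outcome only once a majority of reviewers agree, and otherwise signals that the job should go back for a further review. The function name, labels and threshold are hypothetical assumptions, not Meta’s actual system.

```python
# Illustrative sketch only: majority-consistency logic of the kind DMR is
# described as using. Names and thresholds are hypothetical, not Meta's.
from collections import Counter
from typing import Optional

def dmr_outcome(decisions: list[str], required_majority: int = 2) -> Optional[str]:
    """Return the agreed outcome once a majority of reviewers concur;
    None means the job should be sent back for another review."""
    outcome, count = Counter(decisions).most_common(1)[0]
    return outcome if count >= required_majority else None

# Two reviewers disagree: no majority yet, so the job goes back for a third review.
assert dmr_outcome(["remove", "keep"]) is None
# A third reviewer breaks the tie, so the majority decision ("keep") can proceed.
assert dmr_outcome(["remove", "keep", "keep"]) == "keep"
```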

These reviews did not meet the company's conditions for activating any other mistake prevention systems.

Journalist Registration Program

Meta’s Journalist Registration Program grants enrolled journalists enhanced security protections, such as extra safeguards against harassment, impersonation, hacking and other threats. Enrollment does not automatically guarantee inclusion in SSR, but Meta can consider signals such as a user being a journalist. Only journalists affiliated with a news organization registered as a “news Page” can enroll in the Journalist Registration Program, and they must verify their identity by submitting five bylined articles, a staff directory biography link, or a professional news organization email address. Enrollment is available in a limited number of regions, but not in Somalia or Somaliland. Meta was not able to provide country eligibility criteria for this program.

The Board asked Meta 26 questions covering the review and appeal process for pages, mistake prevention, cross-check and automation systems used in moderating pages, as well as content moderation resources and capacity for Somalia, including Somaliland. These questions also addressed Meta’s support and protection programs for media and journalists. Meta responded to 23 questions fully and partially responded to three.

4. Public Comments

The Oversight Board received two public comments that met the terms for submission. One comment was submitted from Sub-Saharan Africa and the other from Asia Pacific and Oceania. To read public comments submitted with consent to publish, click here.

The submissions covered the following themes: best practices in content moderation for public interest content and pages; the media environment and challenges for independent journalism in Somaliland; the role of social media in supporting independent reporting in the region; and the impact of Facebook’s content moderation mistakes on civic discourse and critical journalism.

5. Oversight Board Analysis

The Board analyzed Meta’s decisions in these cases against Meta’s content policies, values and human rights responsibilities, as well as their implications for Meta’s broader approach to content governance. In particular, the Board examined the importance of accurate page enforcement for public interest journalism in regions where press freedom is under attack, noting Meta’s January 7, 2025, “more speech and fewer mistakes” announcement.

5.1 Compliance With Meta’s Content Policies

Content Rules

The Board finds that the four posts did not violate any of Meta's content policies, including its rules on Hateful Conduct, and should not have been removed. Even if they had been violating, this should have resulted in a single strike, which would not be a sufficient basis to unpublish the page. As Meta now concedes, no elements of the page were violating. Without any prior strikes against the page, no penalties should have been imposed, and unpublication was therefore a severe and entirely unjustified response.

Meta’s failure to protect public interest journalism in these cases is a concern. Social media platforms, particularly Facebook, have become essential tools for independent journalists in Somaliland. Given that the page appears to be run by a freelance journalist with a significant following of about 90,000 people, posting on sensitive political and social matters, the impacts of these failures in quality control, mistake prevention and access to appeals are serious. They further exacerbate a hostile environment for speech marked by governmental repression and censorship, in a context where malicious reporting from state-affiliated actors is a significant risk (see also public comment by the Legal Journal on Technology, PC-31314).

5.2 Compliance With Meta’s Human Rights Responsibilities

The Board finds that removing the content and page was inconsistent with Meta’s human rights responsibilities.

  1. Freedom of Expression (Article 19 ICCPR)

Meta’s content moderation practices can have adverse impacts on the right to freedom of expression. Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides broad protection for this right, given its importance to political discourse (General Comment No. 34, paras. 11 and 38). The United Nations (UN) Human Rights Committee emphasizes that “a free, uncensored and unhindered press ... is essential” and “constitutes one of the cornerstones of a democratic society” (General Comment No. 34, para. 13).

When restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” The Board uses this framework to interpret Meta’s voluntary human rights commitments, in relation to both the individual content decisions under review and to Meta’s broader approach to content governance. As the UN Special Rapporteur on freedom of opinion and expression has stated, although “companies do not have the obligations of Governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41).

I. Legality

The principle of legality requires rules limiting expression to be accessible and clear, formulated with sufficient precision to enable an individual to regulate their conduct accordingly (General Comment No. 34, para. 25). Additionally, these rules “may not confer unfettered discretion for the restriction of freedom of expression on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not” (ibid.). The UN Special Rapporteur on freedom of expression has stated that when applied to private actors’ governance of online speech, rules should be clear and specific (A/HRC/38/35, para. 46). People using Meta’s platforms should be able to access and understand the rules, and content reviewers should have clear guidance regarding their enforcement.

Meta does not have a single, centralized resource documenting its content policies governing pages, which raises issues of clarity and transparency. Information about Meta’s approach to page governance is scattered across six different locations, including the Transparency Center, Help Center and Business Center. Policy resources on Meta’s Business Center require logging into a Facebook account to be accessed.

These resources are inconsistent in the level of detail provided and, at times, vague or even contradictory. This makes it difficult and confusing for users and reviewers to understand the rules and raises significant legality concerns.

Meta’s public-facing language provides limited details about the process and criteria for Meta taking action against pages. For example, Meta does not provide details about the criteria for unpublishing a page, aside from the general information described in the linked resources above. While publicly Meta says it applies its strike system to content posted from pages, it has distinct, confidential criteria for unpublishing pages.

Similarly, while the Transparency Center lists violations that can occur through the key elements of the page, such as “the name, description or cover photo,” it does not explain the threshold, provided internally to reviewers, that needs to be met for a page to be unpublished.

Meta should ensure that rules on page enforcement are easy for users to access and navigate, ideally consolidated in a single, centralized resource, and clearly understandable.

II. Legitimate aim

In international human rights law as applied to states, any restriction on freedom of expression should also pursue one or more of the legitimate aims listed in the ICCPR, which include protecting the rights of others.

The Board has previously held that Meta’s Hateful Conduct policy aims to protect the rights of others, a legitimate aim that is recognized by international human rights standards (see e.g., Posts Supporting UK Riots, Myanmar Bot decisions).

III. Necessity and Proportionality

Under ICCPR Article 19(3), necessity and proportionality require that restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment No. 34, para. 34). According to the Special Rapporteur, companies should “demonstrate the necessity and proportionality of any content actions (such as removals or account suspensions)” (A/HRC/38/35, para. 28), and are required “to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41).

The Board finds that the removal of these posts, and the page unpublication, was without basis in any rule, and was therefore not necessary but entirely arbitrary and disproportionate. These cases indicate a systemic problem of poor review quality for Somali language content, compounded by inadequacies in Meta’s mistake prevention programs and appeals processes. Meta’s confirmation that it mistakenly removed the 10 other posts appealed to the Board supports this conclusion. This is especially concerning considering the restrictive environment for press freedom and freedom of expression more broadly in Somaliland. Government use of censorship through restrictive laws and intimidation to control the media and silence critical reporting has driven a heavy reliance on social media, and especially Facebook, from which the company benefits (see public comment by Somali Journalists Syndicate, PC-31295). In this environment, Meta has a heightened responsibility to protect freedom of expression and ensure access to information. Given the severe consequences Meta’s page unpublication decisions and content removals can have for journalists in this region, the company must be especially attentive to ensuring accurate enforcement so that it does not further contribute to adverse human rights impacts.

Mistake Prevention

Mistake-prevention measures are essential to ensure that speech is not restricted unnecessarily. By reducing wrongful removal of content and pages, these systems are designed to help Meta pursue safety goals through less intrusive means, aiming to address harmful content without placing an undue burden on freedom of expression.

Meta’s failure to include the page in its SSR cross-check system points to a potentially broader systemic problem: pages like this one appear to meet the applicable criteria for inclusion but are nevertheless not proactively enrolled. Had this page been enrolled, the removal of the four posts and the page unpublication would have been escalated for secondary review in-house, allowing Meta to catch and correct the errors and investigate their root cause. This highlights the need for more proactive identification and enrollment of vulnerable actors into SSR, in particular journalists and other prominent commentators working in places like Somaliland. These entities may not be commercially significant for Meta, but the consequences for freedom of expression of excluding them from these protective mechanisms are acute. SSR criteria should be adapted to market realities: thresholds like follower count should be scaled in proportion to the number of users in the market. Civic actors who regularly engage in public interest expression, particularly in repressive or closed information environments like Somaliland, should be prioritized.

GSR cross-check also appears to overlook important public interest content from independent journalists and freelance reporters and did not make up for the limitations of SSR enrollment in this context. As designed, GSR should have detected the posts and the page for additional review, but did not, suggesting that the system is not attaching sufficient weight to the most relevant detection criteria in this context. The Board has previously expressed concerns that even when activated, content eligible for GSR rarely reaches Meta’s subject matter experts and therefore lacks a level of review where contextual analysis can be applied (see Policy Advisory Opinion on Meta’s cross-check program). The local context of public-interest reporting, the speaker’s reach relative to the size of the market, and the wider environment for civic and media freedom in Somaliland should have led to the content being prioritized for additional, in-house review under GSR.

Meta’s additional mistake-prevention systems such as DMR were either not effective in preventing mistakes in this case or not used at all. While DMR was activated for the content appeals, it failed to correct the erroneous enforcement decisions due to issues of reviewer accuracy and tooling problems. DMR was not activated for the page unpublication decision, or for the initial content decisions. These decisions were not flagged for additional review under other mistake prevention systems because they did not meet Meta’s threshold for activation. These systems are designed to prevent enforcement errors in high-impact or sensitive scenarios, particularly when potential enforcement consequences are severe, and the Board finds that these cases should have met that threshold.

The Journalist Registration Program is a mechanism that could have helped in this case, as it can influence the inclusion of a journalist or their content in cross-check, although it does not automatically guarantee it. It is concerning that Meta could not provide information about the country eligibility criteria for this program, and that Somalia (including Somaliland) is not eligible, notwithstanding the significant risks journalists in the country face. Similarly, the formal enrollment requirements exclude freelance journalists from the benefits of this program.

This catalogue of shortcomings highlights a failure of mistake prevention that impacts the voices most in need of these safeguards: people engaging in journalism in repressive environments where audiences rely on Facebook for independent reporting. It was only as a consequence of the Board taking these cases that Meta began taking action to address these issues. These cases should guide Meta to improve enrollment of sensitive entities in SSR cross-check and to improve the ranking and detection of GSR, DMR and other similar mechanisms. Meta should also ensure that its Journalist Registration Program is properly resourced and rolled out in more markets, especially where repressive environments threaten independent reporting, and that it extends to prominent freelance journalists.

2. Access to Remedy

Article 2(3) of the ICCPR requires states to ensure that “any person whose rights or freedoms as herein recognized are violated shall have an effective remedy.” This means access to competent “judicial, administrative, legislative or other appropriate” mechanisms to determine the violation and provide enforcement of the remedy. The UN Human Rights Committee has stated that “remedies should be appropriately adapted so as to take account of the special vulnerability of certain categories of person” (General Comment No. 31). In the Joint Declaration on Media Freedom and Democracy, the UN Special Rapporteur on freedom of expression and regional freedom of expression mandate holders advise that “large online platforms should privilege independent quality media and public interest content on their services in order to facilitate democratic discourse” and “swiftly and adequately remedy wrongful removals of independent quality media and public interest content, including through expedited human review” (Recommendations for online platforms, page 8).

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. The UNGPs bridge the gap between state obligations and corporate responsibilities and serve as a crucial framework for applying the right to an effective remedy, encouraging companies to establish and participate in grievance mechanisms that are “legitimate, accessible, predictable, equitable, transparent, rights-compatible, and a source of continuous learning” (UNGP Principle 31). While international human rights law places the primary duty on states to protect against abuses by private actors, the UNGPs introduce a parallel “responsibility to respect human rights,” requiring businesses, including platforms, to avoid infringing on rights and to address any adverse impacts. The UNGPs encourage companies to “establish or participate in effective operational-level grievance mechanisms for individuals and communities who may be adversely impacted” (Principle 29) and state that such mechanisms should be capable of “providing early warning and addressing grievances before they escalate” (Principle 30).

The Board understands that errors happen in content moderation at scale because of volume, complexity, and the limits of automation and human judgment. It also notes Meta’s renewed commitment, on January 7, 2025, to “more speech and fewer mistakes.” The Board is deeply concerned that Meta’s appeals systems failed to provide adequate remedy for the user in these cases. This is especially troubling given the fragile environment for media and independent journalists in Somaliland, and the public’s limited access to information about current events in the territory.

The fact that a human moderator was tasked with reviewing an appeal against decisions they made further undermined the integrity of the process. The Board has previously raised concerns about this occurring (see Violence against Women decision) and considers that capacity constraints are not sufficient justification. More robust safeguards are needed, particularly for independent journalists in places like Somaliland.

That these issues only came to light through this user bringing an appeal to the Board demonstrates the importance of providing users with effective remedy and highlights fundamental weaknesses in the company’s processes. Notwithstanding eight appeal reviews of the initial content decisions, the correct outcome was not reached, and the appeal against the original page unpublication that triggered those removals was not reviewed at all. It took Meta a significant amount of time, in the course of these cases, to provide only general information about the root cause of these errors. This highlights a systemic weakness in error assessment and identification. While the Board acknowledges that, for technical reasons, Meta does not configure its moderation tools to capture more detailed information about reviewers’ decision-making, the issues identified in these cases suggest this seriously inhibits Meta’s ability to investigate and solve systemic enforcement problems.

6. The Oversight Board’s Decision

The Oversight Board overturns Meta’s decisions to remove the four pieces of content and to unpublish the page, which requires that the page be reinstated as well.

7. Recommendations

Content Policy

1. To better inform users about the rules applying to pages, Meta should create a consolidated resource in its Transparency Center to explain its content policies and enforcement guidance, including the strike system, and ensure this is easily accessible for page administrators.

The Board will consider this recommendation implemented when Meta creates this consolidated resource in the Transparency Center and demonstrates that it is signposted to page administrators.

Enforcement

2. To ensure access to effective remedy, Meta should revise its appeal processes to prohibit the same human reviewers from assessing appeals against their own decisions, including on page unpublication. This should be done in a way that does not result in an increase in appeal review jobs being auto-closed.

The Board will consider this recommendation implemented when Meta provides the Board with documentation confirming that these rules have been updated, and data showing no associated decline in the rate of appeals reviewed.

3. To help ensure that pages are not erroneously unpublished, Meta should update its General Secondary Review (GSR) ranking algorithm to explicitly prioritize page unpublication decisions for in-house review.

The Board will consider this recommendation implemented when Meta provides the Board with documentation outlining these new prioritization criteria, along with data demonstrating any associated shifts in how frequently unpublishing decisions are reviewed under the GSR program.

4. To protect journalism in regions where media freedom is repressed, Meta should develop new criteria and systems to proactively enroll pages or accounts engaged in journalism in these regions in Sensitive Entity Secondary Review (SSR). Follower thresholds should be adjusted relative to market size, and existing criteria for news organization designation should not be a bar to entry. Authoritative sources such as the Committee to Protect Journalists’ impunity index should be used to prioritize high-risk regions.

The Board will consider this recommendation implemented when Meta shares data and information with the Board detailing its new designation criteria for journalistic SSR protections and the number of resulting new designations per market.

*Procedural Note:

  • The Oversight Board’s decisions are made by panels of five Members and approved by a majority vote of the full Board. Board decisions do not necessarily represent the views of all Members.
  • Under its Charter, the Oversight Board may review appeals from users whose content Meta removed, appeals from users who reported content that Meta left up, and decisions that Meta refers to it (Charter Article 2, Section 1). The Board has binding authority to uphold or overturn Meta’s content decisions (Charter Article 3, Section 5; Charter Article 4). The Board may issue non-binding recommendations that Meta is required to respond to (Charter Article 3, Section 4; Article 4). Where Meta commits to act on recommendations, the Board monitors their implementation.
  • For this case decision, independent research was commissioned on behalf of the Board. The Board was assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology. Linguistic expertise was provided by Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world.
