Multiple Case Decision

Posts Sharing Speeches in Syrian Conflict

The Oversight Board calls on Meta to add to its tools for moderating content in armed conflict to mitigate information asymmetries its policies may create between different parties to conflicts.

2 cases included in this bundle

Overturned

FB-WK0ZJ2Z9

Case about dangerous individuals and organizations on Facebook

Platform
Facebook
Topic
News events, Politics, War and conflict
Standard
Dangerous individuals and organizations
Location
Syria
Date
Published on October 2, 2025
Overturned

FB-XICM1710

Case about dangerous individuals and organizations on Facebook

Platform
Facebook
Topic
News events, Politics, War and conflict
Standard
Dangerous individuals and organizations
Location
Syria
Date
Published on October 2, 2025

Summary

The Oversight Board calls on Meta to add to its tools for moderating content in armed conflict to mitigate information asymmetries its policies may create between different parties to conflicts. The Board emphasizes that civilians in rapidly evolving conflicts utilize social media in a manner distinct from non-conflict situations, to quickly share information that can help keep people safe. Meta needs to study how its prohibition on channeling communications from entities designated under its Dangerous Organizations and Individuals policy affects people's access to information and their protection against violence. The Board has called on Meta to restore two posts, with newsworthiness allowances, in which users shared content from leaders of the organization Hayat Tahrir al-Sham (HTS), shortly before the toppling of the Assad regime in Syria.

About the Cases

In late 2024, two Facebook users in Syria posted content related to HTS, an organization designated as a terrorist group by the United Nations (UN) Security Council, which led the offensive that overthrew the regime of Bashar al-Assad.

In the first case, a user whose appeal to the Board stated they are a journalist posted a video in Arabic to their page in November. The video showed an HTS commander’s speech encouraging rebel fighters to “attack your enemies and suffocate them.” Addressing Assad’s forces, the commander said, “You have no choice but to be killed, flee or defect.” Meta removed the content less than 15 minutes after it was posted for violating the Dangerous Organizations and Individuals policy. It was viewed almost 5,000 times.

In the second case, an image was posted on a public page in December containing a photograph of HTS leader Ahmed al-Sharaa and Arabic text of part of a speech he gave the same day. The speech encouraged HTS fighters to “not waste a single bullet except in the chests of your enemy, for Damascus awaits you.” The post was automatically removed within minutes for violating the Dangerous Organizations and Individuals Community Standard. The day after, HTS forces took the Syrian capital, Damascus.

Meta prevented the accounts from going live and demoted page reach and visibility. The posting users appealed, and Meta confirmed the content’s removal. The users both appealed to the Board. The company later said the posts also violated its Violence and Incitement policy.

Key Findings

The majority of the Board finds that removing the content was inconsistent with Meta’s human rights responsibilities, and Meta’s relevant policies must be adjusted to ensure such alignment in the future. The public interest in receiving information that could keep people safe in a rapidly evolving conflict situation, where the regime severely limited information flows, and the low likelihood that sharing this content would lead to additional harm are of particular relevance. The Board notes that in this and any political conflict, communication is truncated, making contextual clues as to the motivations for a post less overt to outsiders. Granting a scaled newsworthiness allowance was warranted.

A minority of the Board disagrees, finding the posts’ removal consistent with Meta’s human rights responsibilities and the Board’s precedent. Both posts relay orders to kill, without any commentary and with little actionable information to keep civilians safe.

The Board finds that, by channeling communications from a designated group without clear intent to engage in permitted social and political discourse, both posts violate the Dangerous Organizations and Individuals policy. It also finds that both posts violate the Violence and Incitement policy as they contain clear calls for violence.

Meta's refusal to tell users which organizations and individuals cannot be discussed under its Dangerous Organizations and Individuals policy is particularly problematic during armed conflicts, when designated entities may be acting as de facto governing authorities. The policy’s exception for social and political discourse is also insufficiently transparent, as there are significant differences between publicly disclosed information and internal guidance on what is permissible discourse.

Meta’s enforcement, since February 2025, of a non-public yet fully operative policy on how people can refer to or share communications from President al-Sharaa in his official capacity does not meet the requirements of legality. Users must be aware of policies like this one to ensure they can understand how they may exercise their expressive rights within Meta’s rules.

The Board notes that Meta’s moderation in the Syrian conflict may have led to questionable information asymmetries that put users at risk. Meta’s policies allow calls for violence against listed entities but prohibit them against regular militaries. This is regardless of either side’s conduct.

The Oversight Board’s Decision

The Board overturns Meta's decisions to take down both posts and requires that they be restored with a newsworthiness allowance.

The Board also recommends that Meta:

  • Add a lever to the Crisis Policy Protocol that allows the platform to mitigate information asymmetries its policies may create. This could include policy levers such as: suspending the prohibition on sharing information from designated entities involved in the conflict; suspending strikes or reducing feature limits where content is found violating for unclear intent; providing education to users on how to share information about designated entities in permissible ways. When these policy levers are invoked, the measure must be made public.
  • Study, in consultation with impacted stakeholders, how its prohibition on channeling official communications on behalf of a designated entity under the Dangerous Organizations and Individuals policy impacts access to information and protection of civilians against violence in armed conflicts.
  • Report to the Board about its efforts in the last five years to assess whether and how its Violence and Incitement and Dangerous Organizations and Individuals Community Standards should be modified to account for International Humanitarian Law standards, and set out its near-term future plans in this area.

*Case summaries provide an overview of cases and do not have precedential value.

Full Case Decision

1. Case Description and Background

In late 2024, two Facebook users in Syria posted content related to Hayat Tahrir al-Sham (HTS), an organization designated as a terrorist group by the United Nations (UN) Security Council. HTS led a military offensive between November 27 and December 8, 2024, which toppled the Assad regime in Syria. This marked the end of a chapter in an armed conflict that had begun in 2012 after the regime brutally repressed massive peaceful demonstrations that had begun the year before. In early 2025, HTS leader Ahmed al-Sharaa became Syria’s interim president. He currently leads a transitional government with many former HTS members holding senior positions. He has ordered the dissolution of HTS and other armed groups in the country.

In the first case, a user posted a short video in Arabic to their page on November 28, 2024. The video was of a speech given by Abu Zubair al-Shami, an HTS commander, dressed in military fatigues and a face covering. In the speech, he quoted the Quran, cited crimes committed by the Assad regime, celebrated the revolution “of pride and dignity” to “recover rights and remove injustices,” and encouraged rebel fighters to keep fighting and to “attack your enemies and suffocate them.” He also stated, “Today we are living a new phase of our blessed revolution, after the Military Operations Administration launched Operation Deterrence of Aggression,” which was a reference to HTS’ recent operations to overthrow the regime of President Bashar al-Assad. In a section directly addressing Assad’s forces, Mr. al-Shami said, “You have no choice but to be killed, flee or defect.” The user added a caption stating, “The speech of the military commander Abu Zubair al-Shami,” with the hashtags #MilitaryOperationsManagement and #DeterrenceOfAggression in Arabic, referencing the name of the HTS-led command structure and offensive, both mentioned by Mr. al-Shami in the speech. A user reported the content almost immediately and, within 15 minutes, Meta removed it for violating the Dangerous Organizations and Individuals policy. The post was viewed almost 5,000 times in that 15-minute period.

In the second case, an administrator of a public page posted a single image on December 7, 2024, containing a photograph of Mr. al-Sharaa and Arabic text on the image. The text is an excerpt from a speech he gave that day, congratulating the group’s revolutionary fighters for inflicting heavy losses on their enemy. He also praised them for releasing prisoners of the Assad regime and for replacing “the darkness of injustice and tyranny with the light of justice and dignity.” He urged them to “leave the liberated cities that God has bestowed upon you to your brothers in the police and security so that they may stand on their borders and perform their duty.” This was a reference to territorial gains HTS and allied militias made as they advanced south from their stronghold in Idlib, taking other towns and cities, until they reached the capital, Damascus, a few days later. He encouraged them to keep fighting to liberate Syria and restore people’s rights, and “not waste a single bullet except in the chests of your enemy, for Damascus awaits you.” The post was automatically detected for violating the Dangerous Organizations and Individuals Community Standard and removed within minutes of being posted. The day after the content was posted, HTS forces entered Damascus, meeting little to no opposition, marking the end of the Assad regime.

Due to the seriousness with which Meta treats violations of the Dangerous Organizations and Individuals policy, the company applied a severe strike to each user’s account and applied feature limits to them and the pages the content was posted to, preventing them from going live and demoting the pages’ reach and visibility. In both cases, the posting users appealed the decisions to the company, which confirmed its removals of both posts, and they both appealed to the Board. When the Board selected these cases for review, Meta noted that both posts also violated its Violence and Incitement policy.

A 2024 UN report said that state and non-state actors in Syria, including HTS, committed human rights violations with impunity throughout the conflict. The UN Commission found, among other things, “continuing patterns of crimes against humanity and war crimes” by the Assad government, and that HTS members had engaged in torture, cruel treatment and extra-judicial killings of civilians, which could amount to war crimes. Hundreds of thousands of Syrians were killed between 2011 and 2025, the vast majority of them by Assad’s forces and their allies.

2. User Submissions

In the video case, the user explained they were a journalist who posted the video to inform and educate. They stated the video played an important role in sharing information with the public and that its removal undermines press freedom.

In the photograph case, the user questioned why Meta allowed images and quotes from former President al-Assad throughout the Syrian conflict. They called former President al-Assad a dictator and Mr. al-Sharaa a revolutionary. They argued that Meta’s approach, including allowing pictures of former President al-Assad and not Mr. al-Sharaa, amounted to supporting dictatorship and bloodshed, while “restricting freedom of opinion.”

3. Meta’s Content Policies and Submissions

I. Meta’s Content Policies

Dangerous Organizations and Individuals

The Dangerous Organizations and Individuals policy rationale states that, in an effort to prevent and disrupt real-world harm, Meta does not allow organizations or individuals that proclaim a violent mission or that are engaged in violence to have a presence on its platforms. Meta maintains a list of designated organizations and individuals divided into two tiers, with Tier 1 subject to the most extensive enforcement. Under Tier 1, Meta specifies that they “do not allow” designated individuals or organizations, or leaders or prominent members of these organizations, “to have a presence on the platform.” They also “remove any support for these individuals and organizations.”

Tier 1 entities are described as those that “engage in serious offline harm including organizing or advocating for violence against civilians.” Tier 1 includes entities and individuals designated by the United States government as “foreign terrorist organizations (FTOs) or specially designated global terrorists (SDGTs),” as well as entities that Meta independently determines meet its Tier 1 criteria. The company provides more information on how entities are designated and de-designated here. Both HTS as an organization and President al-Sharaa as an individual were designated under Tier 1 at the time of the posts at issue in this case. On July 8, 2025, while the Board was considering this case, the United States revoked its designation of HTS as an FTO and, at the time this decision was published, Meta was evaluating whether HTS met the company’s criteria for removal from its own list of designated entities. At the time of publication of this decision, the UN and several countries maintained their designation of HTS as a terrorist entity.

Meta removes “glorification, support and representation” of Tier 1 entities, their leaders, founders or prominent members, as well as “unclear references” to them. Prohibited forms of “support” include “channeling information, including official communications, on behalf of a designated entity or event.” As an example of channeling, Meta cites someone directly quoting a designated entity without a caption that [1] condemns, [2] neutrally discusses or [3] is part of news reporting. Such captions would fall within Meta’s “social and political discourse” exception. Meta defines reporting as content that “includes information that is shared to raise awareness about local and global events in which designated dangerous organizations and individuals are involved.” The policy states that this exception requires a clear indication of intent from the posting user. Meta requires clear statements of intent for social and political discourse because it wants to allow these types of discussions while still limiting offline harm. Where a user’s intent is ambiguous or unclear, Meta defaults to the removal of the post. Meta has previously clarified to the Board that this exception was not intended as a loophole that would permit posts providing tangible operational or strategic advantage to a designated entity by letting third parties distribute official campaign material and official propaganda, or allowing official channels of communication on behalf of such groups. Meta also noted that permitting the channeling of information within the exception would essentially allow designated entities to circumvent Meta’s policies to share their agendas (see Greek 2023 Elections Campaign decision).

In response to the Board’s questions, Meta explained that, in addition to the three examples of categories of social and political discourse listed in the public-facing policy, Meta’s internal guidance includes many more categories and the stipulation that references to designated entities must fall into one of them in order to be permitted, either by being explicitly mentioned or through an unambiguous indication of intent to discuss this subject. The full list of permitted social and political discourse categories is: elections; parliamentary and executive functions; conflict resolutions (truces/ceasefires etc); international agreements or treaties; disaster response and humanitarian relief; local community services; human rights and humanitarian discourse; neutral discussion and fictional depiction of a designated entity and their behaviors; news reporting; condemnation and criticism; satire and humor; and legal discussion around a designated entity and perpetrators of violating violent events.

Internal Guidance and Crisis Policy Protocol

On February 25, 2025, months after the content in question was posted and soon after Mr. al-Sharaa became Syria’s interim leader, Meta issued “internal, at-scale, global, and time-bound policy guidance” temporarily changing the enforcement of the Dangerous Organizations and Individuals Community Standard in Syria with respect to President al-Sharaa. This confidential guidance allowed content that would otherwise be considered channeling official communications from and on behalf of President al-Sharaa exclusively when shared in his official capacity as Syria’s interim president. This guidance, which was only made public through the Board’s announcement of these cases, covers “posts, videos, or images of presidential engagements, public statements, decisions, press releases, speeches, and interviews, especially those officially shared by the Syrian presidency.” Under this guidance, Meta continues to remove glorification, support or representation of HTS. If communications by or on behalf of President al-Sharaa include other policy violations, Meta instructs its reviewers to remove the content.

On May 20, Meta updated the guidance to allow content that “references President al-Sharaa,” including positive references. Meta confirmed that the guidance applies to Facebook, Instagram and Threads, and that it does not have similar guidance for any other member of the Syrian transitional government or other prominent figures from the Syrian conflict.

Meta explained that this guidance was one of several measures taken after designating Syria a crisis in early December 2024. This designation was made under Meta’s Crisis Policy Protocol (CPP), which the company created in response to one of the Board’s previous recommendations. Other measures undertaken pursuant to the CPP included removing calls to arms or for civilians to be armed, removing claims outing people as being associated with the Assad regime and “launching a trending event for third-party fact-checkers to be able to quickly identify and debunk false claims related to the conflict.” Trending events are a tool Meta uses to more proactively detect content related to important circumstances that it considers high-risk for viral misinformation. Meta compiles a list of relevant keywords that are used to identify relevant content that could spread misinformation. Content identified through this process is labelled and made easily filterable for review in the tool used by third-party fact-checkers outside of the United States.

Violence and Incitement

The Violence and Incitement policy rationale states that Meta aims to “prevent potential offline violence that may be related to content on our platforms.” Meta removes “language that incites or facilitates violence and credible threats to public or personal safety.”

Meta removes threats of violence that could lead to death or other forms of high-severity violence. It defines threats as “statements or visuals representing an intention, aspiration, or call for violence against a target, and threats can be expressed in various types of statements such as statements of intent, calls for action, advocacy, expressions of hope, aspirational statements and conditional statements.” The policy notes that Meta does not “prohibit threats when shared in awareness-raising or condemning context.” The policy also allows for some aspirational calls for violence as well as “certain threats” directed against violent actors and terrorist groups.

Newsworthiness Allowance

Meta can allow content on its platforms that violates its policies when it is considered sufficiently newsworthy. Meta notes that it only does this “after conducting a thorough review that weighs the public interest against the risk of harm.” These are very rare interventions, with only 32 granted between June 2023 and June 2024, 69 granted between June 2022 and June 2023 (17 of which were “scaled,” see below), and 68 granted between June 2021 and June 2022. This allowance can only be issued by Meta’s content policy team on escalation. In determining newsworthiness, Meta assesses whether content surfaces an “imminent threat to public health or safety or gives voice to perspectives currently being debated as part of a political process.” It also considers other factors, including country-specific circumstances, the nature of the speech (especially whether it relates to governance or politics) and the political structure of the country, including whether it has a free press.

Meta removes potentially newsworthy content when “leaving it up presents a risk of harm, such as physical, emotional and financial harm, or a direct threat to public safety.” Newsworthiness allowances can be “narrow,” i.e., applied to a single piece of content, or “scaled,” which “may apply more broadly to something like a phrase.” Meta stated to the Board that it did not issue any newsworthiness allowances during the months before the fall of the Assad regime.

II. Meta’s Submissions

Dangerous Organizations and Individuals

Meta stated that the video case violated the Dangerous Organizations and Individuals policy by giving support to HTS via channeling its official communications, thereby sharing material produced by a designated entity. Meta noted that the video featured an HTS military leader and appeared to be created by HTS, as Meta determined that it did not contain any signals that it was produced by a third party, such as a news entity. Unlike the content in the photograph case, the user included a caption identifying the speaker and hashtags – #MilitaryOperationsManagement and #DeterrenceOfAggression. However, Meta explained that, although it may have been posted by a journalist, the post did not qualify as news reporting as the video was produced by HTS and the user shared official communications from them in full without any editorial intervention to make clear it was news reporting.

Meta also removed the content in the photograph case for giving support to HTS, a designated organization, and Mr. al-Sharaa, a designated individual, by channeling their official communications. Meta noted that the content is a caption-less reshare of what appears to be an official statement by Mr. al-Sharaa made in his previous capacity as HTS commander during an active conflict. Meta explained that, as the content did not have a caption, it was unclear what the user’s intent was in sharing it. As a result, the content would not fit under any allowable context under the social and political discussion exception, including news reporting or neutral discussion.

Meta also noted that, as the content was shared before Mr. al-Sharaa became Syria’s president, the internal policy guidance allowing users to channel official communications from him in that capacity would not have applied.

Meta also stated that neither post would be allowable as news reporting within permitted social and political discourse. This was because “there is no evidence that the content was shared for the purpose of improving the understanding of an issue or knowledge of a subject that has public interest value.” Meta noted that the photograph post was shared without a caption or greater awareness-raising context. While the video had a caption and was posted by a journalist, it was “unclear if they posted it to raise awareness. The caption simply restates facts already included in the video without additional context, explanation or further discussion.”

Violence and Incitement

Meta explained that both posts also violated its Violence and Incitement policy.

Meta found the content in the video case contained two violations of the policy. In the video, Mr. al-Shami calls on HTS fighters to “attack your enemies and suffocate them,” which Meta interpreted as a threat of high-severity violence (a call to action to attack and kill the Assad regime’s soldiers and/or affiliates). Additionally, Meta found the statement in the video contained a conditional threat directed at al-Assad’s soldiers, saying, “You have no choice but to be killed, flee or defect.”

In the photograph case, Mr. al-Sharaa urged HTS fighters and supporters to “not waste a single bullet except in the chests of your enemy.” Meta took this to be a conditional threat to shoot and kill the “enemy,” which it understood as the Assad regime’s soldiers and/or its affiliates.

Newsworthiness

Meta confirmed that it did not view either post as newsworthy for the purpose of applying a policy allowance. In assessing the public interest value of the content, Meta noted that it “removed the content in these cases to prevent and disrupt potential offline harm in light of the documented atrocity crimes committed by all sides of the conflict, the violent nature of the threats contained in the posted speech, the status’ of the speakers in each case … and the escalating situation in Syria at the time the content was posted.”

The Board asked Meta questions about how it changed its policies and enforcement practices to respond to the Syrian conflict and its resolution, as well as on the application of the Dangerous Organizations and Individuals policy to parties to the conflict. Meta responded to all questions.

4. Public Comments

The Oversight Board received two public comments that met the terms for submission. One of the comments was from the Middle East and one was from Sub-Saharan Africa. To read public comments submitted with consent to publish, click here.

The submissions covered the following themes: the humanitarian record of the Assad regime and the new Syrian government; that threats and violent rhetoric on social media can manifest into violence in Syria; the need to moderate content that targets minorities in Syria; and that content circulating news and political developments should not be censored.

5. Oversight Board Analysis

The Board selected these cases to address how Meta’s content policies and enforcement affect freedom of expression during rapidly developing conflicts, with a particular focus on Syria’s war, in which people were sharing information about a proscribed organization engaged in the conflict.

The Board analyzed Meta’s decisions in these cases against Meta’s content policies, values and human rights responsibilities. The Board also assessed the implications of these cases for Meta’s broader approach to content governance.

5.1 Compliance With Meta’s Content Policies

Content Rules

The Board finds that both posts violate the Dangerous Organizations and Individuals policy. Both contain official communications from HTS, one a video of a speech by an HTS military commander and the other an image containing text from a speech by the group’s leader. The posts provide support, as that concept is described in the Community Standards, to Mr. al-Sharaa and HTS by channeling their communications to audiences without clear indications that the users shared the content to engage in permitted social and political discourse, for example, in the form of neutral discussion. Had the users added even brief commentary indicating they were trying to do so, then the posts may have been permissible as social and political discourse and not have been removed. However, in the absence of clear indicators of that intent, the posts violate the plain meaning of the policy.

The Board also finds that the posts violate the Violence and Incitement policy. The content in the video case calls for HTS fighters to “attack your enemies and suffocate them” before telling their enemies that they will be killed if they do not flee or defect. In the photograph case, the content calls for HTS’s fighters and allies to shoot their enemies.

In assessing whether the posts violated these two Community Standards, the Board noted the complexity of seeking to apply these rules to situations of armed conflict, as these policies do not engage with relevant international humanitarian law standards or address how they would apply differently in an armed conflict.

Newsworthiness Allowance

For the reasons set out in the necessity and proportionality analysis below, a majority of the Board finds that, despite violations of the Dangerous Organizations and Individuals and the Violence and Incitement policies, a scaled newsworthiness allowance should have been applied in both cases. This would allow the posts and others sharing the same video and image with unclear intent during that period to remain on the platform if the accompanying captions did not include any other violations of Meta’s policies. A minority of the Board disagrees and has included a dissenting opinion in the necessity and proportionality analysis below.

5.2 Compliance With Meta’s Human Rights Responsibilities

A majority of the Board finds that removing the content from the platform was not consistent with Meta’s human rights responsibilities and that Meta’s relevant policies must be adjusted to ensure such alignment in the future.

Freedom of Expression (Article 19 ICCPR)

Article 19 of the International Covenant on Civil and Political Rights (ICCPR) provides for broad protection of expression, including political expression. This right includes the “freedom to seek, receive and impart information and ideas of all kinds.” Each aspect of this right should be respected, including during armed conflicts, in particular to ensure that civilian populations have access to information that may be crucial to their understanding of the latest conflict developments and dynamics. This concern should continue to inform Meta’s human rights responsibilities, alongside the mutually reinforcing and complementary rules of international humanitarian law that apply during such conflicts (see General Comment No. 31, Human Rights Committee, 2004, para. 11; Commentary to the UN Guiding Principles on Business and Human Rights (UNGPs), Principle 12; see also UN Special Rapporteur on freedom of expression’s report on disinformation in armed conflicts, A/77/288, paras. 33-35 (2022)).

Access to information in a conflict can mean the difference between life and death. The UN Special Rapporteur on freedom of expression has stated that “during armed conflict, people are at their most vulnerable and in the greatest need of accurate, trustworthy information to ensure their own safety and wellbeing. Yet, it is precisely in those situations that their freedom of opinion and expression … is most constrained by the circumstances of war and the actions of the parties to the conflict and other actors to manipulate and restrict information for political, military and strategic objectives” (A/77/288, para. 1). The report also notes that the manipulation of information is a common feature of armed conflict that ranges from attempts to deceive opposition to attempts to influence civilians or stir up hatred. Social media platforms play a dual role in conflicts, providing both a means for people to remain connected to the outside world as well as to receive a wide range of critical life-saving information while also serving as vectors of disinformation and hate speech. Nevertheless, the Board notes that knowing which actors are spreading what information, including when it is misleading, may still provide important context to unfolding events and allow people to more comprehensively assess the risks they face.

When restrictions on expression are imposed by a state, they must meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). These requirements are often referred to as the “three-part test.” The Board uses this framework to interpret Meta’s human rights responsibilities in line with the UNGPs, which Meta has committed to in its Corporate Human Rights Policy. The Board does this in relation to the individual content decision under review and what this says about Meta’s broader approach to content governance. As the UN Special Rapporteur on freedom of expression has stated, although “companies do not have the obligations of governments, their impact is of a sort that requires them to assess the same kind of questions about protecting their users’ right to freedom of expression” (A/74/486, para. 41).

I. Legality (Clarity and Accessibility of the Rules)

The principle of legality requires rules limiting expression to be accessible and clear, formulated with sufficient precision to enable an individual to regulate their conduct accordingly (General Comment No. 34, para. 25). Additionally, these rules “may not confer unfettered discretion for the restriction of freedom of expression on those charged with [their] execution” and must “provide sufficient guidance to those charged with their execution to enable them to ascertain what sorts of expression are properly restricted and what sorts are not” (ibid.). The UN Special Rapporteur on freedom of expression has stated that when applied to private actors’ governance of online speech, rules should be clear and specific (A/HRC/38/35, para. 46). People using Meta’s platforms should be able to access and understand the rules, and content reviewers should have clear guidance regarding their enforcement.

The Board reiterates its concerns, first stated in the Nazi Quote decision, that Meta’s refusal to disclose its list of designated entities makes the Dangerous Organizations and Individuals policy insufficiently clear, as users do not know which entities they can channel communications from. While Meta declined a Board recommendation to fully disclose its list of designated entities, it agreed in June 2024 to a Board recommendation made in April that year (Sudan Rapid Support Forces) that it link to the public United States designations lists where these are referenced in the policy. Meta has not yet implemented this recommendation. The Board notes that whereas governments generally disclose terrorist designations, Meta does not. This continued lack of transparency is particularly problematic during armed conflicts, when the need to discuss these entities’ conduct is especially pressing and designated entities may be acting as de facto governing authorities.

The Board finds that the Dangerous Organizations and Individuals carveout for social and political discourse is also insufficiently transparent, given the significant differences between the publicly disclosed information (limited to exceptions permitting “reporting,” “neutral discussion” and “condemnation” of designated entities or their activities) and the internal guidance provided to reviewers, which lists many other examples of permissible discourse.

Additionally, the Board finds that Meta’s enforcement, since February 2025, of a non-public yet fully operative policy on how people can refer to or share communications from President al-Sharaa does not meet the requirements of legality. It is essential that users are aware of policies like this one to ensure they can understand the scope of Meta’s rules and how they may exercise their expressive rights within them.

Finally, as applied to the content in these cases, the Board finds that the Dangerous Organizations and Individuals and Violence and Incitement policies are sufficiently clear, while noting the conceptual challenges of applying these policies in armed conflicts when applicable rules and principles of international humanitarian law are not reflected in these policy lines.

II. Legitimate Aim

Any restriction on freedom of expression should also pursue one or more of the legitimate aims listed in the ICCPR, which include protecting the rights of others (Article 19, para. 3, ICCPR).

The Board has previously found that Meta’s Dangerous Organizations and Individuals policy aims to “prevent and disrupt real-world harm.” In several decisions, the Board has found that this policy pursues the legitimate aim of protecting the rights of others, such as the right to life (Article 6, ICCPR) and the right to security of the person (Article 9, ICCPR).

The Board has also previously found that the Violence and Incitement Community Standard aims to “prevent potential offline harm” by removing content that poses “a genuine risk of physical harm or direct threats to public safety.” This policy serves the legitimate aim of protecting the right to life and the right to security of person (Article 6, ICCPR; Article 9, ICCPR; General Comment No. 35, para. 9).

III. Necessity and Proportionality

Under Article 19(3) of the ICCPR, the principle of necessity and proportionality requires that restrictions on expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment No. 34, para. 34).

A majority of the Board finds that the removal of both posts is neither necessary nor proportionate. The Board has previously used the contextual factors outlined in the Rabat Plan of Action to assess whether imminent harm is likely and whether it can only be averted by removal of content (for example, the Weapons Post Linked to Sudan’s Conflict and Tigray Communication Affairs Bureau decisions). In applying the Rabat contextual factors to these cases, the Board only relies on information that would reasonably have been available to Meta at the time of its initial review of these posts, noting that the outcome would be the same regardless of how events unfolded after the content was posted. A majority finds the following:

Social and Political Context: The posts were reviewed as HTS-led militias were rapidly gaining territory in an ongoing offensive, the outcome of which was not known, following a protracted conflict that involved widespread human rights abuses and attacks on civilians by various parties to the conflict. The context of the information environment is also relevant. Assad’s regime severely limited press freedom as well as access to information for the majority of Syrian civilians, including arresting and persecuting those who posted anti-regime news or opinions on social media platforms. The Assad regime was using its control of traditional media operations in its territory to routinely suppress information during the conflict. The proscription of many opposition groups on social media, while the government could operate freely, further diminished the space for oppositional voices and created problematic information asymmetries. Syrians living under regime control saw on traditional media a government-curated view of the war that emphasized the strength of Assad’s forces and his allies rather than the strengths and successes of his opponents. Due to these repressive measures, many Syrians (including vulnerable minorities) depended on social media to access and impart alternative sources of information and to find out how the conflict was progressing, whether they might be caught up in it and their overall political and safety situation.

Identity and Status of the Speakers: Those who posted the content are not public figures and appear to have minimal direct influence over their audiences, though the self-identified journalist has a large number of followers. The people depicted in the content are public figures with significant reach and influence due to their positions in HTS.

Intention of Speakers: It is difficult to interpret the intent of those who posted the content, given the lack of substantive commentary in the captions. There is no indication from their posts either way regarding whether they intended to incite violence or endorse the messages from HTS. While the intent of those depicted appears to be primarily to encourage HTS fighters to continue their offensive, including by killing their enemies, this intent should not be transferred directly to those sharing the posts. This is distinct from the Tigray Communication Affairs Bureau decision with respect to the inference of intent, as in that case, incitement was posted by an account officially associated with a warring party, rather than shared by bystanders who are not clearly combatants or speaking for HTS. It is also distinct from the Cambodian Prime Minister case, where the incitement by the head of state was directed at political opposition in the context of an election, to intimidate and suppress the expression and public participation of others. Although the speaker in the Sudan’s Rapid Support Forces Video Captive decision was not identified as one of the warring parties, the caption in the content appeared to endorse the designated entity’s violent message.

In a crisis, when time is of the essence, communication is necessarily truncated. Moreover, in a repressive setting in which few channels of communications exist and there are legitimate concerns of surveillance and scrutiny of social media posts, there may be good reasons to minimize commentary. That contextual clues are less overt or visible to outsiders does not mean they will not be discerned by a poster’s audience, which may well understand the context in which certain information is being shared. In this case, and in any political conflict, people rely on shorthand.

Harm (including its likelihood and imminence): The Board notes that the shared content from HTS was directed not at a general audience, but was providing instruction and encouragement to members of HTS. Neither Mr. al-Shami nor Mr. al-Sharaa was encouraging civilians to directly participate in the hostilities. The references to violence against Assad’s military were unlikely to prompt additional violence from HTS forces and even less likely to prompt additional violence from others. As third parties, the users’ sharing of the content was more likely to inform the public and enable them to keep track of breaking developments in the conflict. Syrian civilians attempting to follow the rapidly developing conflict in the absence of independent media benefited from having access to all available information, including messages from senior leaders of armed groups steering the direction of the conflict. Being able to compare claims from all parties about military developments, as well as HTS’s statement about advancing towards Damascus, could constitute actionable information that civilians would have used to keep themselves safe. The users’ sharing of these posts did not increase the likelihood of HTS members attacking their enemies, and removing user expression sharing this information did not materially reduce risks of harm.

In reviewing these contextual factors, a majority finds that the risk of these posts leading to additional violence was outweighed by the need for the public to understand the rapidly evolving situation in Syria, where the regime severely limited information flows. In its public comment to the Board (PC-31259), the civil society organization SMEX, which is based in the region, noted that, while updates from rebel groups often promoted their own aims, they were vital for Syrians to stay informed. In such a situation, removal of the posts did not constitute the least intrusive means to avert potential harm, nor was it proportionate with respect to the public interest in receiving information about fast-moving developments. This distinguishes these cases from the Greece 2023 Elections Campaign decision, where widespread national media coverage ensured alternative ways to access information other than hearing more directly from designated entities or their affiliates.

By providing a counter-narrative to the government-controlled media in Syria at the time, the content in both cases provided information that would have been useful to Syrian civilians following the conflict closely to make decisions for their own safety. The video included an update on the rebel offensive that had just started, stating that a “new phase” of the conflict had begun, which would have conveyed to listeners information about an increased incidence of violence. Additionally, the hashtags linked in the caption would allow users to access further content and information on the development of the offensive. The photograph announced the final push of the offensive into Damascus and noted that the rebel forces had taken control of cities from government forces and released prisoners. All of these would have been valuable pieces of information for people living under a repressive regime with no free press. This can be distinguished from the context in the Sudan’s Rapid Support Forces Video Captive decision, where the Board came to the opposite conclusion. In that case, there was less information in the post directly relevant to civilians seeking to monitor the progress of the conflict and keep themselves safe. Additional harm was also likely to result from exposing the identity of a prisoner of war, and the caption added by the user appeared to endorse the violent message of the designated entity in the content.

The Board also notes that Meta’s moderation in the Syrian conflict may have led to information asymmetries that put its users at risk by limiting their access to information. In the context of a repressive regime attempting to completely control access to information during a fast-moving conflict, it is vital that companies like Meta ensure they respect free expression and access to information. The Assad regime’s complete control of traditional media and ability to freely post on social media platforms allowed it to create a highly misleading narrative about the rebel advance. By preventing users from providing information from rebel groups, Meta, even if inadvertently, contributed to this information asymmetry and impeded Syrians’ access to potentially vital information. Meta’s policies also allow calls for violence against designated entities but not calls for violence against regular militaries, even when the latter are implicated in severe human rights violations amounting to crimes against humanity. This asymmetry would have allowed the Assad regime and its supporters to relay communications, including aspirational or conditional threats of violence against HTS, similar armed groups and their supporters on Meta’s platforms, but not the other way around.

A majority finds that, under these circumstances, Meta should have granted a scaled newsworthiness allowance to avoid improper infringement on the public’s right to information. This allowance could be extended to posts that shared the same content with unclear intent, if the accompanying captions did not include any other violations of Meta’s policies. This would ensure that Syrian civilians had access to as much information about the conflict as possible. Of particular relevance here is the lack of access to information at the time, the rapidly evolving conflict situation, and the low likelihood that sharing of this content would lead to additional harm.

A majority also finds that Meta should study the possibility of relaxing the requirement for users to show clear intent to engage in social and political discourse when sharing information about designated entities or official communications from them. This is especially important in the context of a protracted or rapidly evolving conflict, when it is vital that information that could be used to keep people safe is rapidly disseminated, particularly under a repressive regime that controls the traditional media and punishes online dissent.

A minority of the Board disagrees, finding the removal of both posts without applying the newsworthiness allowance consistent with the company’s human rights responsibilities and Board precedent.

For the minority, the Rabat Plan of Action’s factors of context, the speakers’ identity and influence over their audience, and the likelihood of imminent violence made removal necessary and proportionate. Both posts relay orders to kill from highly influential senior leaders of an armed organization recently involved in serious violations of international law, without commentary. The intended audience is highly compliant militants. A context of information asymmetry and limited media freedom did not reduce the heightened risk of harm to their “enemies,” including civilians. The Board’s precedent directly supports the necessity of removal (Sudan’s Rapid Support Forces Video Captive; Tigray Communication Affairs Bureau), and minor factual variations between those posts and these do not meaningfully distinguish the underlying human rights analysis. Like the present cases, those conflicts were protracted and involved widespread human rights abuses and attacks on civilians, a tightly controlled information environment and severely limited media freedom. Both featured opposition forces advancing towards government-controlled territory and speeches from prominent leaders or members of armed forces, which contained calls for violence.

For a minority, the posts contained little to no actionable information about conflict developments that would assist civilians in staying safe. While the information environment was limited, the removal of these two posts would not have meaningfully restricted Syrians’ ability to access other real-time and more objective non-violating updates about the conflict on social media. Meta’s human rights responsibilities in an armed conflict do not require it to permit the use of its technologies by persons to relay military communications. For Meta to allow third parties to reshare violent threats or calls to violence from prominent leaders of designated entities without commentary or criticism would essentially allow designated entities to have a presence on the platforms to convey such messages. In the Cambodian Prime Minister decision, the Board warned against Meta using the newsworthiness allowance to permit credible threats of violence in the name of the public’s right to information. The majority’s conclusions here are at odds with that precedent.

The Board finds that strikes and severe penalties imposed on users who share information from designated entities, where the accompanying captions do not violate any other policy provisions, could be excessive. Users therefore need to be better informed about how they can discuss designated entities and individuals, and share information from them, without violating the policy. This is especially important during fast-moving crises, when sharing such information in permissible ways becomes vital for people to be better informed about real-life threats and impactful developments around them.

6. The Oversight Board’s Decision

The Board overturns Meta's decisions to take down both posts and requires that they be restored with newsworthiness allowances.

7. Recommendations

A. Content Policy

1. To ensure people can access critical information during crises and armed conflicts to help them stay safe, Meta should add a policy lever to the Crisis Policy Protocol (CPP) that allows the platform to mitigate information asymmetries its policies may create. This could include policy levers such as: suspending the prohibition on sharing information from designated entities involved in the conflict; suspending strikes or reducing feature limits where content is found violating for unclear intent; and providing education to users on how to share information about designated entities in permissible ways. When these policy levers are invoked, the measure must be made public.

The Board will consider this recommendation implemented when Meta shares with the Board both the updated CPP and the resulting criteria for deploying these policy levers in situations of armed conflict.

2. Meta should study, in consultation with impacted stakeholders, how its prohibition on channeling official communications on behalf of a designated entity under the Dangerous Organizations and Individuals policy impacts access to information and protection of civilians against violence in armed conflicts. This study should rely on a detailed qualitative and quantitative analysis of an adequate representative sample of content that has been affected by the relevant part of the Dangerous Organizations and Individuals policy in a selected number of armed conflicts. For example, this can cover a six-month period of relevant content removals from a selected number of conflicts to analyze the trade-offs between content that could have led to harm if it had remained online and the impacts on people’s right to impart and receive information that keeps them better informed in conflict situations.

The Board will consider this recommendation implemented when Meta shares the full study with the Board, including any measures that Meta may take in response to that study.

3. Meta should report to the Board about its efforts in the last five years to assess whether and how its Violence and Incitement and Dangerous Organizations and Individuals Community Standards should be modified to account for International Humanitarian Law (IHL) standards, and set out its near-term future plans in this area, consistent with the UNGPs (Principle 12, commentary), which calls on companies to consider IHL standards in their operations.

The Board will consider this recommendation implemented when Meta shares this information with the Board.

*Procedural Note:

  • The Oversight Board’s decisions are made by panels of five Members and approved by a majority vote of the full Board. Board decisions do not necessarily represent the views of all Members.
  • Under its Charter, the Oversight Board may review appeals from users whose content Meta removed, appeals from users who reported content that Meta left up, and decisions that Meta refers to it (Charter Article 2, Section 1). The Board has binding authority to uphold or overturn Meta’s content decisions (Charter Article 3, Section 5; Charter Article 4). The Board may issue non-binding recommendations that Meta is required to respond to (Charter Article 3, Section 4; Article 4). Where Meta commits to act on recommendations, the Board monitors their implementation.
  • For this case decision, independent research was commissioned on behalf of the Board. The Board was assisted by Duco Advisors, an advisory firm focusing on the intersection of geopolitics, trust and safety, and technology.
