Case Description
The Oversight Board will address the two cases below together, choosing either to uphold or overturn Meta’s decisions on a case-by-case basis.
The Board has selected two cases involving content related to the Syrian group Hay’at Tahrir al-Sham (HTS). In late 2024 HTS, an organization designated as a terrorist group by the UN Security Council, led a military offensive that toppled the Assad regime in Syria. In early 2025 the group’s leader, Ahmed al-Sharaa, became Syria’s interim president. He leads the transitional government, which has ordered the dissolution of HTS and other armed groups in the country.
In the first case, an administrator of a public page posted an image on December 7, 2024, containing a photograph of Ahmed al-Sharaa and Arabic text. The text appears to be an excerpt from a speech given by Sharaa that day. In it he congratulated the group’s revolutionary soldiers for subduing their enemy. He also praised them for releasing prisoners of the Assad regime and replacing “the darkness of injustice and tyranny with the light of justice and dignity.” He encouraged them to keep fighting to liberate Syria and restore people’s rights, urging them to “not waste a single bullet except in the chests of your enemy, for Damascus awaits you.” The day after the content was posted, rebel forces led by HTS took the Syrian capital Damascus and toppled the Assad regime.
In the second case, a user who self-identified as a journalist posted a short video in Arabic to their page on November 28, 2024. The video was of a speech given by Abu Zubair al-Shami, an HTS commander dressed in military fatigues and a face covering. In the speech, he quoted the Quran, cited crimes committed by the Assad regime, celebrated the revolution “of pride and dignity” to “recover rights and remove injustices,” and encouraged rebel soldiers to keep fighting. Additionally, in a section directly addressing Assad’s forces, al-Shami said, “you have no choice but to be killed, flee or defect.”
Meta’s Dangerous Organizations and Individuals policy prohibits glorification, support or representation of terrorist organizations, including those that have been designated by the U.S. government as Foreign Terrorist Organizations (FTO) or Specially Designated Global Terrorists (SDGT). HTS has been designated as both an FTO and an SDGT, while Sharaa has been designated an SDGT. On February 25, 2025, Meta issued internal guidance to allow “content channeling official communications from/on behalf of al-Sharaa exclusively when shared in his official capacity as the interim president of Syria.” The company still removes glorification, support or representation of HTS. Meta’s Violence and Incitement policy prohibits threatening or calling for violence that could lead to death.
Shortly after each piece of content was posted, it was removed by Meta under the Dangerous Organizations and Individuals policy for supporting HTS. Meta later determined that both posts also violated the Violence and Incitement policy. Both users unsuccessfully appealed to Meta before appealing to the Oversight Board. In their appeal to the Board, the user who posted the first piece of content questioned why Meta would ban mention of people fighting for freedom in Syria while “supporting dictatorship” by allowing photos of former president Bashar al-Assad. The user who posted the second piece of content, which was seen by almost 5,000 people in the 15 minutes it was on the platform, explained that, as a journalist, they were trying to inform their audience of factual developments and that Meta’s removal undermines press freedom.
The Board selected these cases to address how Meta's content moderation impacted freedom of expression in Syria, as people were sharing information from or about a proscribed organization rapidly gaining control over the country during a conflict situation. The cases allow the Board to address how Meta should prevent designated organizations and individuals from using its platforms to cause harm, while also ensuring that people are informed of vital developments that may affect their lives. These cases fall within the Board's Crisis and Conflict strategic priority.
The Board would appreciate public comments that address:
- The impact of Meta’s content moderation in Syria on freedom of expression and security, in particular as a designated terrorist group increased its territorial control and, subsequent to these posts, some of its leading members assumed roles in the transitional government.
- The impact of Meta’s content moderation in Syria on vulnerable groups, including religious minorities.
- Media freedom and access to information in Syria, particularly the role of social media and citizen journalism, in the weeks leading up to the overthrow of the Assad regime and the months following.
- The security situation in Syria, with a focus on violations of international human rights law and international humanitarian law in the run-up to and after the fall of the Assad regime.
As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to these cases.
Public Comments
If you or your organization feel you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the button below. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 Pacific Standard Time (PST) on Tuesday 27 May.
What’s Next
Over the next few weeks, Board Members will be deliberating these cases. Once they have reached their decision, we will post it on the Decisions page.
Comments
1) Who we are
SMEX is a non-profit that advocates for and advances human rights in digital spaces across West Asia and North Africa (WANA). Our vision is for everyone living in the WANA region and its diaspora to be able to access and engage with the internet, mobile services, and other networked spaces safely and without fear of censorship, surveillance, or repercussion.
2) Context of the situation in Syria
Since 2011, the Syrian people have demanded freedom and justice after decades of repression. The Assad regime confronted peaceful protests with killings and mass arrests. According to the Syrian Network for Human Rights, the regime has killed over 201,290 civilians and forcibly disappeared at least 96,321 individuals since 2011; to this day, their fate remains unknown. The Assad regime has been sanctioned by the US government, the United Nations continues to gather evidence of international crimes committed during the war, and regime officials have been charged with, and in some cases successfully prosecuted for, war crimes and crimes against humanity.
As the Syrian revolution continued, various groups formed to confront the Assad regime, some of which were classified as terrorist groups due to human rights violations and ties to already sanctioned groups. In November 2024, what remained of these groups in northern Syria launched an operation to take down the Assad regime. During the rebels’ advance towards the capital Damascus, the operation’s main leader, Ahmad al-Sharaa (Abu Mohamad al-Julani), head of Hay’at Tahrir al-Sham (HTS), sent messages asking his fighters to act with mercy, not revenge. The rebels warned the Syrian army, commanded by Bashar al-Assad, either to defect and lay down its weapons or to face confrontation. Such threats were directed only at armed Assad loyalists, not at religious or ethnic minorities.
On December 8, 2024, the Assad regime fell after Bashar al-Assad fled Damascus. The speed of events and the lack of credible information pushed users to rely on social media for updates, especially since official Syrian media were allied with the Assad regime and the rebel groups used their own channels for official communications and announcements. These updates typically promoted the rebels and their campaign, but they also provided context and details on the rebels’ advance and operations. Such information was vital to audiences who wanted to stay informed about the unfolding events. Even though Bashar al-Assad is internationally sanctioned for committing war crimes, his appearances and media messages are not usually subject to content moderation on Meta’s platforms. The removal of content related to the rebels’ advance was therefore seen as a biased practice that lacked vital sensitivity to the context of Syrian political developments.
Since the new authorities assumed power, they have worked to unify militias opposed to the Assad regime under the new ministry of defense, with varying degrees of success. Some of these armed groups, whether or not they integrated into the new army, took part in violence against Syrian minorities, mainly the Druze and Alawite communities. Other minorities, such as Kurds and Christians, are also subject to sectarian incitement online. According to Amnesty International, government-affiliated groups targeted Alawite civilians on the Syrian coast in March 2025, after Assad-allied groups attempted to stage a coup. Similarly, according to the Syrian Observatory for Human Rights, attacks against Druze-majority areas in Rif Dimashq and near Sweida were fueled by sectarian incitement following the circulation of an unsourced and unverified voice message insulting the Prophet Mohamad.
The threat of online sectarian and extremist rhetoric manifesting in real-life violence in Syria is well founded, and posts on social media can escalate such attacks. Content that clearly incites violence against minorities in Syria should be moderated and kept in check. However, circulating news and updates about unfolding political developments in Syria should not be censored under the pretext of content moderation. Striking this balance will only be possible if Meta invests in engagement with impacted communities and uses that engagement to directly build the cultural literacy of its automated and human moderation systems. For example, Meta’s teams need to develop mechanisms to monitor incitement, including implicit sectarian rhetoric such as the use of minority-related slurs, insulting emojis, and references to locations and areas in Syria that are home to minority communities.
3) Policy and exceptions at time this content was posted
Meta’s Dangerous Organizations and Individuals (DOI) and Violence and Incitement policies are intended “to prevent and disrupt real-world harm” and “to prevent potential offline violence that may be related to content on our platforms,” respectively. However, in the context of Syria, and the WANA region more broadly, these policies have resulted in the over-removal of content that is vital for public understanding, documentation, and debate. The Board has repeatedly highlighted the failures of Meta’s moderation in Arabic and across the WANA region, as well as the importance of free expression on Meta’s platforms, including during times of crisis and conflict. Through case decisions allowing a variety of types of content, at times even content much more directly linked to offline harm than the posts in these cases, the Board has made it clear that removal should be limited to the most extreme cases. The Board has also stressed the importance of newsworthiness, including with regard to content that would otherwise be removed under the Violence and Incitement and DOI policies.
- Lack of evidence regarding harmful intent or affiliation
Unfortunately, the case announcement includes very little information about the accounts that shared this content, in particular the public page that posted the photograph and speech excerpt of Ahmed al-Sharaa. We know that the second piece of content in this bundle was shared by a user who self-identified as a journalist. We do not know what kind of public page shared the first post, or whether it identified itself as a source of news. Regardless, there is no indication that these accounts are dedicated to promoting HTS or inciting violence.
- The content in both cases does not violate the DOI and V&I policies
According to the case summary, Meta initially removed the content under its Dangerous Organizations and Individuals (DOI) policy and later said the Violence and Incitement policy also applied. We believe that neither policy was correctly applied here.
We argue that while the Violence and Incitement policy might be considered relevant, it contains clear exceptions that apply in this case. Specifically, the policy permits content “when shared in an awareness-raising or condemning context.” Furthermore, the policy also allows “aspirational or conditional threats of violence... directed at terrorists and other violent actors.” Although Meta has refused to publish its opaque Dangerous Organizations and Individuals list, Bashar al-Assad and the Syrian army are well-documented violent actors.
- The Dangerous Organizations and Individuals policy
The DOI policy should not have been applied to this content. It clearly falls under the policy’s exception for content that “reports on, neutrally discusses, or condemns dangerous organizations and individuals or their activities.” Even if the content were deemed a technical violation, it would still qualify under the DOI policy’s exception, which “recognize[s] that users may share content that includes references to designated dangerous organizations and individuals in the context of social and political discourse.”
According to the Board’s own analysis in the Shared Al Jazeera and Mention of the Taliban in News Reporting cases, sharing information about such groups does not automatically amount to praise, support, or representation, even when the shared information includes what could be considered a conditional threat of violence, as in the Shared Al Jazeera case. The current version of the DOI policy only prohibits content that constitutes glorification, support, or representation (terms which lack precise definitions in the policy text). There is also no clear indication in the case description that the content explicitly glorified HTS or any other designated group. In line with Meta’s stated exceptions, “neutrally discussing” designated entities is explicitly allowed.
- The Violence and Incitement policy
It is questionable whether the V&I policy applies at all. But even if it does, the content qualifies under clearly stated exceptions. Meta’s policy states: “We do not prohibit threats when shared in an awareness-raising or condemning context, when less severe threats are made in the context of contact sports, or certain threats against violent actors, like terrorist groups,” and, “In some cases, we see aspirational or conditional threats of violence, including expressions of hope that violence will be committed, directed at terrorists and other violent actors (e.g., ‘Terrorists deserve to be killed,’ ‘I hope they kill the terrorists’). We deem those non-credible, absent specific evidence to the contrary.”
In this case, the content consisted of conditional, non-specific threats aimed at violent actors (the Assad regime and its forces), and thus aligns with the permitted examples in Meta’s policy. This is supported by the Shared Al Jazeera case, in which a user reshared a statement from a designated organization containing a very clear (but conditional) threat of violence: “The resistance leadership in the common room gives the occupation a respite until 18:00 to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah neighborhood in Jerusalem, otherwise he who warns is excused. Abu Ubaida – Al-Qassam Brigades military spokesman.” As in that case, neither post here directly encouraged violence.
4) The removal of this content reflects Meta’s ongoing failures in moderating content in conflict zones and content in Arabic or relating to socio-political issues in the WANA region
These removals reflect systemic over-enforcement, disregard for contextual exceptions, and unequal application of policies toward state vs. non-state actors. Oversight Board precedents, Meta’s policy frameworks, and well-documented impacts of content moderation on human rights in the region support a finding that the content in question falls squarely within permissible exceptions for neutral reporting, awareness-raising, and conditional threats against violent actors.
Meta’s DOI policy and the automation used to enforce it have consistently resulted in the over-moderation of Arabic content, content from Arabic-speaking regions, and even content merely commenting on socio-political issues in the WANA region. The Board recognized this in its policy advisory opinion (PAO) on how Meta moderates content that uses the word “shaheed” to refer to designated dangerous individuals (hereinafter the “Shaheed decision”). But it is not just the Board: Meta itself admitted to over-moderating in its request for that PAO, where it said it “may be over-enforcing on significant amounts of speech not intended to praise a designated individual, particularly among Arabic speakers.” Human rights assessments have shown the same. A 2022 report by Business for Social Responsibility (BSR), commissioned by Meta in response to a Board recommendation, clearly states that Meta over-moderates content in Arabic compared to content in Hebrew. The recommendations from this report have still not been effectively implemented. The excessive application of policies regarding terrorist content and violent organizations has, even if inadvertently, resulted in the removal of lawful content from Muslim and Arabic-speaking communities. This constitutes an infringement upon their rights to non-discrimination, freedom of expression, assembly, and association.
We also note that Meta’s failures extend to content moderation during times of crisis and conflict. As the Board noted in a recent case, “Posts Supporting Riots in the UK,” delays in activating Meta’s Crisis Policy Protocol (CPP) can allow incitement to violence to proliferate unchecked. We are concerned about how this will play out in the rapidly shifting environment of post-Assad Syria, where violence can flare up quickly, with devastating results. Additionally, we are concerned that information about when and why Meta activates the CPP appears to be available only through Board decisions and case announcements, such as the aforementioned case about the 2024 UK riots. Similarly, information about how Meta is handling content in Syria, particularly as President Trump moves to normalize relations and lift sanctions, is not being made public by the company. Instead, key information, such as the guidance that “content channeling official communications from/on behalf of al-Sharaa exclusively when shared in his official capacity as the interim president of Syria” would be allowed on Meta’s platforms, was issued only internally. We thank the Board for sharing this information in this case announcement, but Meta should have made it public itself. It is also unclear whether al-Sharaa remains on the DOI list or whether Meta intends to delist him. More generally, although the Board does appear to have some insight into the criteria for initiating the CPP, that information should be available to all.
We appreciate the Board’s many recommendations addressing these failures, many of which echo recommendations made by regional and international civil society and the UN over the years. The Board’s interventions have increased transparency, but unfortunately Meta’s moderation itself does not appear to have improved significantly. We note that the Board continues to issue summary decisions addressing incorrect removals of content in Arabic and related to the WANA region, for example “Footage of Massacres in Syria,” “Link to Wikipedia Article on Hayat Tahrir Al-Sham,” and “Reports on the War in Gaza.” All of these decisions were issued after the Board’s Shaheed decision.
5) Recommendations
We urge the Board to restore the content on Meta’s platforms and to use this opportunity to implement further improvements. We encourage the Board to take into consideration the following recommendations:
- Engage Impacted Communities for Cultural Context: We urge the Board to ask Meta to work with people from impacted Syrian communities to ensure appropriate cultural context and understanding. Collaboration with individuals from affected communities is essential to better identify and address genuinely harmful content.
- Strengthen the Violence and Incitement Policy: The current policy addresses explicit threats of low-severity violence against “Protected Characteristic” groups, but requires additional context to act on “veiled or implicit” threats. We recommend that the inclusion of a protected group in any veiled or implicit threat be recognized as relevant context, and that the policy clearly state this. Achieving this will require Meta to deepen its understanding of cultural contexts so it can recognize such threats effectively.
- Increase Transparency Around the Crisis Policy Protocol (CPP): We recommend that Meta publish more information about the CPP on its Transparency page. There is a need for greater clarity regarding when and how the CPP is activated, particularly in cases involving the targeting of minority groups. The Board should reiterate its recommendation in the UK Riots case that Meta “revise the criteria it has established to initiate the Crisis Policy Protocol,” and should specifically note that those criteria should include the targeting of protected characteristic groups, including religious and ethnic minorities.
- Public Notification of Delisting from the DOI List: We recommend that Meta publicly announce when a group or individual has been delisted from the DOI list.