More Transparency Needed Over Banned Symbols Linked to Hate Groups
June 12, 2025
The Oversight Board has considered three cases involving symbols often used by hate groups, but which can also have other uses. The Board calls on Meta to explain how it creates and enforces its designated symbols list under its Dangerous Organizations and Individuals Community Standard. This will provide greater transparency for users.
The Board is concerned about the potential overenforcement of references to designated symbols. Meta should develop a system to automatically flag when non-violating content is being removed in large volume.
The Board upholds Meta's decisions to remove violating content in two of the cases and to leave the content up in the third case.
About the Cases
Meta referred to the Board three Instagram posts involving symbols often used by hate groups, but which can have other uses. The first post involved an image showing a woman, with the words “Slavic Army” and a Kolovrat symbol superimposed on her face covering. In the caption, the user expressed pride in being Slavic and hoped their “people will wake up.”
The second post was a carousel of photographs of a woman wearing an iron cross necklace with a swastika on it and a T-shirt with an AK-47 assault rifle and “Defend Europe”, written in Fraktur font, printed on it. The caption contained the Odal (or Othala) rune, the hashtag #DefendEurope, symbols outlining an M8 rifle and heart emojis.
When referring these two posts to the Board, Meta removed them for violating its Dangerous Organizations and Individuals policy.
The third post was a carousel of drawings of an Odal rune wrapped around a sword, with a quotation about blood and fate by Ernst Jünger, a German author, philosopher and soldier. The caption repeats the quotation, shares a selective early history of the rune and states that prints of the image are for sale. Meta concluded that this post does not breach any of its rules.
Key Findings
The majority of the Board finds that the Kolovrat symbol post glorified white nationalism. A minority disputes any automatic link between Slavic pride and white nationalism. The Board finds the “Defend Europe” post glorified white supremacy. Both posts should be taken down for violating the Dangerous Organizations and Individuals policy. Under the policy rationale, Meta removes content that glorifies, supports or represents ideologies that promote hate. It designates white nationalism and white supremacy as hateful ideologies.
The quotation post does not violate the same policy. It describes the Odal rune in a seemingly neutral manner. There is no glorification of any hateful ideology in the quotation. The post does not reference Nazism or any other designated hateful ideology specifically.
Meta’s decisions were consistent with its human rights responsibilities. For the majority, the Kolovrat symbol post’s contextual cues, including clear references to Slavic nationalism and a “Slavic Army,” may be read as urging followers to take potentially violent action, and it should be removed. A minority disagrees, considering that the post did not pose a direct risk of inciting imminent or likely harm.
The Board considers the removal of the “Defend Europe” post, which contains multiple contextual cues glorifying designated hateful ideologies, necessary and proportionate to the legitimate aim of preventing Meta’s platforms from being abused to organize and incite violence, discrimination or exclusion.
Leaving up the quotation post was justified, as the content does not reference a designated hateful ideology and provides more context around the user’s artwork.
The Board reiterates concerns about the lack of transparency around designation processes under Tier 1 of the Dangerous Organizations and Individuals policy, as this makes it challenging for users to understand which entities, ideologies and related symbols they can share. Meta should provide more transparency around designated symbols, especially those associated with designated hate entities or ideologies, establishing an evidence-based, global and iterative process. It should publish a clear explanation of its processes and criteria for designating the symbols and enforcement against them.
The Board is concerned about potential overenforcement of references to designated symbols. Meta does not collect sufficiently granular data on its enforcement practices in this area. Meta told the Board that its internal definition of a “reference” is broader than the definition of “unclear reference” in its public-facing policy. Meta should publicly provide the internal definition of “references” and define its subcategories, for clarity and accessibility for users.
The Oversight Board’s Decision
The Oversight Board upholds Meta's decisions to take down the content in the first and second cases and to leave up the content in the third case.
The Board recommends that Meta:
- Make public the internal definition of “references” and define its subcategories under the Dangerous Organizations and Individuals Community Standard.
- Introduce a process for determining which designated symbols are added to its list and which category each symbol is assigned to, and periodically audit all designated symbols, ensuring the list covers all relevant symbols globally and removing those that no longer satisfy published criteria.
- Develop a system to automatically identify and flag instances where designated symbols lead to “spikes” that suggest a large volume of non-violating content is being removed.
- Publish a clear explanation of how it creates and enforces its designated symbols list under the Dangerous Organizations and Individuals Community Standard.
Further Information
To read public comments for this case, click here.