OVERTURNED
2021-005-FB-UA

“Two buttons” meme

The Oversight Board has overturned Facebook's decision to remove a comment under its Hate Speech Community Standard.
Policies and topics: Freedom of expression, Humor, Politics; Cruel and insensitive
Region and countries: United States & Canada; United States
Platform: Facebook

Please note that this decision is available in both Turkish (via the ‘language’ tab accessed through the menu at the top of this screen) and Armenian (via this link).

To read the full version of the decision in Armenian, click here.

Case summary

The Oversight Board has overturned Facebook’s decision to remove a comment under its Hate Speech Community Standard. A majority of the Board found it fell into Facebook’s exception for content condemning or raising awareness of hatred.

About the case

On December 24, 2020, a Facebook user in the United States posted a comment with an adaptation of the ‘daily struggle’ or ‘two buttons’ meme. This featured the split-screen cartoon from the original ‘two buttons’ meme, but with a Turkish flag substituted for the cartoon character’s face. The cartoon character has its right hand on its head and appears to be sweating. Above the character, in the other half of the split-screen, are two red buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.”

While one content moderator found that the meme violated Facebook’s Hate Speech Community Standard, another found it violated its Cruel and Insensitive Community Standard. Facebook removed the comment under the Cruel and Insensitive Community Standard and informed the user of this.

After the user’s appeal, however, Facebook found that the content should have been removed under its Hate Speech Community Standard. The company did not tell the user that it upheld its decision under a different Community Standard.

Key findings

Facebook stated that it removed the comment as the phrase “The Armenians were terrorists that deserved it” contained claims that Armenians were criminals based on their nationality and ethnicity. According to Facebook, this violated its Hate Speech Community Standard.

Facebook also stated that the meme was not covered by an exception which allows users to share hateful content to condemn it or raise awareness. The company claimed that the cartoon character could be reasonably viewed as either condemning or embracing the two statements featured in the meme.

The majority of the Board, however, believed that the content was covered by this exception. The ‘two buttons’ meme contrasts two different options not to show support for them, but to highlight potential contradictions. As such, they found that the user shared the meme to raise awareness of and condemn the Turkish government’s efforts to deny the Armenian genocide while, at the same time, justifying these same historic atrocities. The majority noted a public comment which suggested that the meme “does not mock victims of genocide, but mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it.” The majority also believed that the content could be covered by Facebook’s satire exception, which is not included in the Community Standards.

The minority of the Board, however, found that it was not sufficiently clear that the user shared the content to criticize the Turkish government. As the content included a harmful generalization about Armenians, the minority of the Board found that it violated the Hate Speech Community Standard.

In this case, the Board noted that Facebook told the user that they violated the Cruel and Insensitive Community Standard when the company based its enforcement on the Hate Speech Community Standard. The Board was also concerned about whether Facebook’s moderators had the necessary time and resources to review content containing satire.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content and requires that the comment be restored.

In a policy advisory statement, the Board recommends that Facebook:

  • Inform users of the Community Standard enforced by the company. If Facebook determines that a user’s content violates a different Community Standard to the one the user was originally told about, they should have another opportunity to appeal.
  • Include the satire exception, which is currently not communicated to users, in the public language of its Hate Speech Community Standard.
  • Adopt procedures to properly moderate satirical content while taking into account relevant context. This includes providing content moderators with access to Facebook’s local operation teams and sufficient time to consult with these teams to make an assessment.
  • Let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy. This includes exceptions for satirical content and where users share hateful content to condemn it or raise awareness.
  • Make sure appeals based on policy exceptions are prioritized for human review.

*Case summaries provide an overview of the case and do not have precedential value.

Full case decision

1. Decision summary

The Oversight Board has overturned Facebook’s decision to remove content under its Hate Speech Community Standard. A majority of the Board found that the cartoon, in the form of a satirical meme, fell into the Hate Speech Community Standard’s exception for content that condemns hatred or raises awareness of it.

2. Case description

On December 24, 2020, a Facebook user in the United States posted a comment with an adaptation of the “daily struggle” or “two buttons” meme. A meme is a piece of media, often humorous, that spreads quickly across the internet. This featured the same split-screen cartoon from the original meme, but with the Turkish flag substituted for the cartoon character’s face. The cartoon character has its right hand on its head and appears to be sweating. Above the character, in the other half of the split-screen, there are two red buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.” The meme was preceded by a “thinking face” emoji.

The comment was shared on a public Facebook page that describes itself as a forum for discussing religious matters from a secular perspective. It responded to a post containing an image of a person wearing a niqab with overlay text in English: “Not all prisoners are behind bars.” At the time the comment was removed, the post it responded to had 260 views, 423 reactions and 149 comments. A Facebook user in Sri Lanka reported the comment for violating the Hate Speech Community Standard.

Facebook removed the meme on December 24, 2020. Within a short period of time, two content moderators reviewed the comment against the company’s policies and reached different conclusions. While the first concluded that the meme violated Facebook’s Hate Speech policy, the second determined that the meme violated the Cruel and Insensitive policy. The content was removed and logged in Facebook’s systems based on the second review. On this basis, Facebook notified the user that their comment “goes against our Community Standard on cruel insensitive content.”

After the user’s appeal, Facebook upheld its decision but found that the content should have been removed under its Hate Speech policy. For Facebook, the statement “The Armenians were terrorists that deserved it” specifically violated the prohibition on content claiming that all members of a protected characteristic are criminals, including terrorists. No other parts of the content, such as the claim that the Armenian genocide was a lie, were deemed to be violating. Facebook did not inform the user that it upheld the decision to remove their content under a different Community Standard.

The user submitted their appeal to the Oversight Board on December 24, 2020.

Lastly, in this decision, the Board referred to the atrocities committed against the Armenian people from 1915 onwards as genocide, as this term is commonly used to describe the massacres and mass deportations suffered by Armenians and is the term used in the content under review. The Board does not have the authority to legally qualify such atrocities, and this qualification is not the subject of this decision.

3. Authority and scope

The Board has authority to review Facebook’s decision following an appeal from the user whose post was removed (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1). The Board may uphold or reverse that decision (Charter Article 3, Section 5). The Board’s decisions are binding and may include policy advisory statements with recommendations. These recommendations are non-binding, but Facebook must respond to them (Charter Article 3, Section 4). The Board is an independent grievance mechanism to address disputes in a transparent and principled manner.

4. Relevant standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

Facebook's Community Standards define hate speech as “a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability.” Under “Tier 1,” prohibited content (“do not post”) includes content targeting a person or group of people on the basis of a protected characteristic with:

  • “dehumanizing speech or imagery in the form of comparisons, generalizations, or unqualified behavioral statements (in written or visual form) to or about […] criminals (including, but not limited to, “thieves”, “bank robbers”, or saying “All [protected characteristic or quasi-protected characteristic] are ‘criminals’”).”
  • speech “[m]ocking the concept, events or victims of hate crimes even if no real person is depicted in an image.”
  • speech “[d]enying or distorting information about the Holocaust.”

However, Facebook allows “content that includes someone else’s hate speech to condemn it or raise awareness.” According to the Hate Speech Community Standard’s policy rationale, “speech that might otherwise violate our standards can be used self-referentially or in an empowering way. Our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.”

Additionally, the Board noted Facebook’s Cruel and Insensitive Community Standard which forbids content that targets “victims of serious physical or emotional harm,” including “attempts to mock victims […] many of which take the form of memes and GIFs.” This policy prohibits content (“do not post”) that “contains sadistic remarks and any visual or written depiction of real people experiencing premature death.”

II. Facebook’s values

Facebook’s values are outlined in the introduction to the Community Standards. The value of “Voice” is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Facebook limits “Voice” in service of four values, and two are relevant here:

“Safety”: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.

“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

III. Human rights standards

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Facebook announced its Corporate Human Rights Policy, where it committed to respecting rights in accordance with the UNGPs. The Board's analysis in this case was informed by the following human rights standards:

  1. Freedom of expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression reports: A/HRC/35/22/Add.3 (2017), A/HRC/41/35/Add.2 (2019), A/HRC/38/35 (2018), A/74/486 (2019), and A/HRC/44/49/Add.2 (2020); the Rabat Plan of Action, OHCHR (2013).
  2. The right to non-discrimination: Article 2, para. 1, ICCPR; Articles 1 and 2, Convention on the Elimination of All Forms of Racial Discrimination (CERD).
  3. The right to be informed in the context of access to justice: Article 14, para. 3(a), ICCPR; General Comment No. 32, Human Rights Committee, (2007).

5. User statement

The user stated in their appeal to the Board that “historical events should not be censored.” They noted that their comment was not meant to offend but to point out “the irony of a particular historical event.” The user noted that “perhaps Facebook misinterpreted this as an attack.” The user further stated that even if the content invokes “religion and war,” it is not a “hot button issue.” The user found Facebook and its policies overly restrictive and argued that “[h]umor like many things is subjective and something offensive to one person may be funny to another.”

6. Explanation of Facebook’s decision

Facebook explained that it removed the comment as a Tier 1 attack under the Hate Speech Community Standard, specifically for violating its policy prohibiting content alleging that all members of a protected characteristic are criminals, including terrorists. According to Facebook, while the first statement in the meme “The Armenian Genocide is a lie” is a negative generalization, it did not directly attack Armenians and thus did not violate the company’s Community Standards. Facebook found that the second statement “The Armenians were terrorists that deserved it” directly attacked Armenians by alleging that they are criminals based on their ethnicity and nationality. This violated the company’s Hate Speech policy.

In its decision rationale, Facebook assessed whether the exception for content that shares hate speech to condemn it or raise awareness of it should apply in this case. Facebook argued that the meme did not fall into this exception, as the user was not clear they intended to condemn hate speech. Specifically, Facebook explained to the Board that the sweating cartoon character in the meme could be reasonably viewed as either condemning or embracing the statements. Facebook also explained that its Hate Speech policy previously included an exception for humor. The company clarified that it removed this exception in response to a Civil Rights Audit report (July 2020) and as part of its policy development. In its response to the Board, Facebook claimed that “creating a definition for what is perceived to be funny was not operational for Facebook’s at-scale enforcement.” However, in the Civil Rights Audit report, the company disclosed it maintained a narrower exception for satire which Facebook defines as content that “includes the use of irony, exaggeration, mockery and/or absurdity with the intent to expose or critique people, behaviors, or opinions, particularly in the context of political, religious, or social issues. Its purpose is to draw attention to and voice criticism about wider societal issues.” This exception is not included in its Community Standards. It appears to be separate from the exception for content that includes hate speech to condemn it or raise awareness of it.

Facebook also clarified that the content did not violate the Cruel and Insensitive policy, which prohibits “explicit attempts to mock victims,” including through memes, because it did not depict or name a real victim.

Facebook also stated that its removal of the content was consistent with its values of “Dignity” and “Safety,” when balanced against the value of “Voice.” According to Facebook, content that calls the Armenian people terrorists “is an affront to their dignity, can be experienced as demeaning or dehumanizing, and can even create risks of offline persecution and violence.”

Facebook argued that its decision was consistent with international human rights standards. Facebook stated that (a) its policy was “easily accessible” in the Community Standards, (b) the decision to remove the content was legitimate to protect “the rights of others from harm and discrimination,” and (c) its decision to remove the content was “necessary and proportionate to limit harm against Armenians.” To ensure that limits on expression were proportionate, Facebook argued that its Hate Speech policy applied to “a narrow set of generalizations.”

7. Third-party submissions

The Oversight Board received 23 public comments related to this case. Four of the comments were from Europe, one from the Middle East and North Africa, and 18 from the United States and Canada.

The Board received comments from parties directly connected to issues of interest for this case. These included a descendant of victims of the Armenian genocide, organizations that study the nature, causes and consequences of genocide, as well as a former content moderator.

The submissions covered themes including: the meaning and use of the “daily struggle” or “two buttons” meme as adapted by the user in this case, whether the content was intended as a political critique of the Turkish government and its denial of the Armenian genocide, whether the content was mocking the victims of the Armenian genocide, and how Facebook’s Hate Speech and Cruel and Insensitive Community Standards relate to this case.

To read public comments submitted for this case, please click here.

8. Oversight Board analysis

The Board looked at the question of whether this content should be restored through three lenses: Facebook’s Community Standards; the company’s values; and its human rights responsibilities.

8.1 Compliance with Community Standards

The Board analyzed each of the two statements against Facebook’s Community Standards, before examining the effect of juxtaposing these statements in this version of the “daily struggle” or “two buttons” meme.

8.1.1. Analysis of the statement “The Armenian Genocide is a lie”

The Board noted that Facebook did not find this statement to violate its Hate Speech Community Standard. Facebook enforces its Hate Speech Community Standard by identifying (i) a “direct attack,” and (ii) a “protected characteristic” the direct attack was based upon. The policy rationale lists “dehumanizing speech” as an example of an attack. Ethnicity and national origin are included among the list of protected characteristics.

Under the “do not post” section of its Hate Speech policy, Facebook prohibits speech “[m]ocking the concept, events or victims of hate crimes even if no real person is depicted in an image.” A majority of the Board noted, however, that the user’s intent was not to mock the victims of the events referred to in the statement, but to use the meme, in the form of satire, to criticize the statement itself. For the minority, the user’s intent was not sufficiently clear. The user could be sharing the content to embrace the statement rather than to refute it.

In this case, Facebook notified the user that their content violated the Cruel and Insensitive Community Standard. Under this policy, Facebook prohibits “attempts to mock victims [of serious physical or emotional harm],” including content that “contains sadistic remarks and any visual or written depiction of real people experiencing premature death.” The Board noted but did not consider Facebook’s explanation that this policy is not applicable to this case because the meme does not depict or name the victims of the events referred to in the statement.

Under the “do not post” section of its Hate Speech policy, Facebook also prohibits speech “[d]enying or distorting information about the Holocaust.” The Board noted the company’s explanation that this policy does not apply to the Armenian genocide or other genocides, and that this policy was based on the company’s “consultation with external experts, the well-documented rise in anti-Semitism globally, and the alarming level of ignorance about the Holocaust.”

8.1.2. Analysis of the statement “The Armenians were terrorists that deserved it”

The Board noted that Facebook found this statement to violate its Hate Speech Community Standard. The “do not post” section of this Hate Speech Community Standard prohibits “[d]ehumanizing speech or imagery in the form of comparisons, generalizations, or unqualified behavioral statements (in written or visual form).” The policy includes speech that portrays the targeted group as “criminals.” The Board believed the term “terrorists” fell into this category.

8.1.3 Analysis of the combined statements in the meme

The Board is of the view that one should evaluate the content as a whole, including the effect of juxtaposing these statements in a well-known meme. A common purpose of the “daily struggle” or “two buttons” meme is to contrast two different options to highlight potential contradictions or other connotations, rather than to indicate support for the options presented.

For the majority, the exception to the Hate Speech policy is crucial. This exception allows people to “share content that includes someone else’s hate speech to condemn it or raise awareness.” It also states: “our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.” The majority noted that the content could also fall under the company’s satire exception, which is not communicated publicly in the Community Standards.

Assessing the content as a whole, the majority found that the user’s intent was clear. They shared the meme as satire to raise awareness of and condemn the Turkish government’s efforts to deny the Armenian genocide while, at the same time, justifying the same historic atrocities. The user’s intent was not to mock the victims of these events, nor to claim those victims were criminals or that the atrocity was justified. The majority took into account the Turkish government’s position on the genocide suffered by Armenians from 1915 onwards (Republic of Turkey, Ministry of Foreign Affairs), as well as the history between Turkey and Armenia. In this context, they found that the substitution of the Turkish flag for the cartoon character’s sweating face, together with the content’s direct link to the Armenian genocide, meant that the user shared the meme to criticize the Turkish government’s position on this issue. The use of the “thinking face” emoji, which is commonly used sarcastically, alongside the meme supports this conclusion. The majority noted public comment “PC-10007” (made available under section 7 above), which suggested that “this meme, as described, does not mock victims of genocide, but mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it.” It would thus be wrong to remove this comment in the name of protecting Armenians when the post is a criticism of the Turkish government, in support of Armenians.

As such, the majority found that, taken as a whole, the content fell within the policy exception in Facebook’s Hate Speech Community Standard. For the minority, in the absence of specific context, the user’s intent was not sufficiently clear to conclude that the content was shared as satire criticizing the Turkish government. Additionally, the minority found that the user was not able to properly articulate what the alleged humor intended to express. Given the content includes a harmful generalization against Armenians, the minority found that it violated the Hate Speech Community Standard.

8.2 Compliance with Facebook’s values

A majority of the Board believed that restoring this content is consistent with Facebook’s values. The Board recognized the Armenian community’s sensitivity to statements concerning the mass-scale atrocities suffered by Armenians from 1915 onwards, as well as the community’s long struggle to seek recognition of the genocide and justice for these atrocities. However, the majority did not find any evidence that the meme in this case posed a risk to “Dignity” and “Safety” that would justify displacing “Voice.” The majority also noted that Facebook referred broadly to “Safety” without explaining how this value applied in this case.

The minority found that, while satire should be protected, as the majority rightly stated, the statements in the comment damage the self-respect of people whose ancestors suffered genocide. The minority also found the statements disrespectful of the honor of those who were massacred, and harmful, as they could increase the risk of discrimination and violence against Armenians. This justified displacing “Voice” to protect “Safety” and “Dignity.”

8.3 Compliance with Facebook’s human rights responsibilities

Freedom of expression (Article 19 ICCPR)

Article 19, para. 2 of the ICCPR provides broad protection for expression of “all kinds,” including written and non-verbal “political discourse,” as well as “cultural and artistic expression.” The UN Human Rights Committee has made clear the protection of Article 19 extends to expression that may be considered “deeply offensive” (General Comment No. 34, paras. 11, 12).

In this case, the Board found that the cartoon, in the form of a satirical meme, took a position on a political issue: the Turkish government’s stance on the Armenian genocide. The Board noted that “cartoons that clarify political positions” and “memes that mock public figures” may be considered forms of artistic expression protected under international human rights law (UN Special Rapporteur on freedom of expression, report A/HRC/44/49/Add.2, at para. 5). The Board further emphasized that the value placed by the ICCPR upon uninhibited expression concerning public figures in the political domain and public institutions “is particularly high” (General Comment No. 34, para. 38).

The Board also noted that laws establishing general prohibitions of expressions with incorrect opinions or interpretations of historical facts, often justified through references to hate speech, are incompatible with Article 19 of the ICCPR, unless they amount to incitement of hostility, discrimination or violence under Article 20 of the ICCPR (General Comment 34, para. 29; UN Special Rapporteur on freedom of expression, report A/74/486, at para. 22).

While the right to freedom of expression is fundamental, it is not absolute. It may be restricted, but restrictions should meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). Facebook should seek to align its content moderation policies on hate speech with these principles (UN Special Rapporteur on freedom of expression, report A/74/486, at para. 58(b)).

I. Legality

Any rules restricting expression must be clear, precise, and publicly accessible (General Comment 34, para. 25). Individuals must have enough information to determine if and how their speech may be limited, so that they can adjust their behavior accordingly. Facebook’s Community Standards “permit content that includes someone else’s hate speech to condemn it or raise awareness,” but ask users to “clearly indicate their intent.” In addition, the Board also noted that Facebook removed an exception for humor from its Hate Speech policy following a Civil Rights Audit concluded in July 2020. While this exception was removed, the company kept a narrower exception for satire that is currently not communicated to users in its Hate Speech Community Standard.

The Board also noted that Facebook wrongfully reported to the user that they violated the Cruel and Insensitive Community Standard, when Facebook based its enforcement on the Hate Speech policy. The Board found that it is not clear enough to users that the Cruel and Insensitive Community Standard only applies to content that depicts or names victims of harm.

Additionally, the Board found that properly notifying users of the reasons for enforcement action against them would help users follow Facebook’s rules. This relates to the legality issue, as the lack of relevant information for users subject to content removal “creates an environment of secretive norms, inconsistent with the standards of clarity, specificity and predictability,” which may interfere with “the individual’s ability to challenge content actions or follow up on content-related complaints” (UN Special Rapporteur on freedom of expression, report A/HRC/38/35, at para. 58). Facebook’s approach to user notice in this case therefore failed the legality test.

II. Legitimate aim

Any restriction on freedom of expression should also pursue a “legitimate aim.” The Board agreed the restriction pursued the legitimate aim of protecting the rights of others (General Comment No. 34, para. 28). These include the rights to equality and non-discrimination, including based on ethnicity and national origin (Article 2, para. 1, ICCPR; Articles 1 and 2, ICERD).

The Board also reaffirmed its finding in case decision 2021-002-FB-UA that “it is not a legitimate aim to restrict expression for the sole purpose of protecting individuals from offense (UN Special Rapporteur on freedom of expression, report A/74/486, para. 24), as the value international human rights law placed on uninhibited expression is high (General Comment No. 34, para. 38).”

III. Necessity and proportionality

Any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment 34, para. 34).

The Board assessed whether the content removal was necessary to protect the rights of Armenians to equality and non-discrimination. The Board noted that freedom of expression currently faces substantial restrictions in Turkey, with disproportionate effects on ethnic minorities living in the country, including Armenians. In a report on his mission to Turkey in 2016, the UN Special Rapporteur on freedom of expression found censorship to be operating in “all the places that are fundamental to democratic life: the media, educational institutions, the judiciary and the bar, government bureaucracy, political space and the vast online expanses of the digital age” (UN Special Rapporteur on freedom of expression, report A/HRC/35/22/Add.3, at para. 7). In the follow-up report published in 2019, the UN Special Rapporteur mentioned that the situation had not improved (UN Special Rapporteur on freedom of expression, report A/HRC/41/35/Add.2, at para. 26).

Turkish authorities have specifically targeted expression denouncing the atrocities committed by the Turkish Ottoman Empire against Armenians from 1915 onwards. In a joint allegation letter, a number of UN special procedures mentioned that Article 301 of the Turkish Criminal Code appears to constitute “a deliberate effort to obstruct access to the truth about what appears to be policy of violence directed against the Turkish Armenian community” and “the right of victims to justice and reparation.” The Board also noted the assassination, in 2007, of Hrant Dink, a journalist of Armenian origin who published a number of articles on the identity of Turkish citizens of Armenian origin. In one of these articles, Dink discussed the lack of recognition of the genocide and how this affects the identity of Armenians. Dink had previously been found guilty by Turkish courts of demeaning “Turkish identity” through his writing. In 2010, the European Court of Human Rights concluded that the verdict against Dink and the failure of Turkish authorities to take appropriate measures to protect his life amounted to a violation of his freedom of expression (see European Court of Human Rights, Dink v. Turkey, para. 139).

A majority of the Board concluded that Facebook’s interference with the user’s freedom of expression was mistaken. The removal of the comment would not protect the rights of Armenians to equality and non-discrimination. The user was not endorsing the statements contrasted in the meme, but rather attributing them to the Turkish government. They did this to condemn and raise awareness of the government’s contradictory and self-serving position. The majority found that the effects of satire, such as this meme, would be lessened if people had to explicitly declare their intent. The fact that the “two buttons” or “daily struggle” meme is usually intended to be humorous, even though the subject matter here was serious, also contributed to the majority’s decision.

The majority also noted that the content was shared in English on a Facebook page with followers based in several countries. While the meme could be misinterpreted by some Facebook users, the majority found that it does not increase the risk of Armenians being subjected to discrimination and violence, especially as the content is aimed at an international audience. They found that bringing this important issue to an international audience is in the public interest.

Additionally, the Board found that removing information without cause cannot be proportionate. Removing content that serves the public on a matter of public interest requires particularly weighty reasons to be proportionate. In this regard, the Board was concerned with Facebook content moderators’ capacity to review this meme and similar pieces of content containing satire. Contractors should follow adequate procedures and be provided with time, resources and support to assess satirical content and relevant context properly.

While supporting the majority’s views on protecting satire on the platform, the minority did not believe that the content was satire. The minority found that the user could be embracing the statements contained in the meme, and thus engaging in discrimination against Armenians. Therefore, the minority held that the requirements of necessity and proportionality were met in this case. In case decision 2021-002-FB-UA, the Board noted Facebook’s position that content depicting blackface would be removed unless the user clearly indicated their intent to condemn the practice or raise awareness of it. The minority found that, similarly, where the satirical nature of the content is not obvious, as in this case, the user’s intent should be made explicit. The minority concluded that, while satire is about ambiguity, it should not be ambiguous regarding the target of the attack, i.e., the Turkish government or the Armenian people.

Right to be informed (Article 14, para. 3(a), ICCPR)

The Board found that the incorrect notice given to the user about which content rule they had violated implicates the right to be informed in the context of access to justice (Article 14, para. 3(a), ICCPR). When limiting a user’s right to expression, Facebook must respect due process and inform the user accurately of the basis of its decision, including by revising that notice where the reason changes (General Comment No. 32, para. 31). Facebook failed that responsibility in this case.

9. Oversight Board decision

The Oversight Board overturns Facebook’s decision to remove the content and requires the content to be restored.

10. Policy advisory statement

The following recommendations are numbered, and the Board requests that Facebook provides an individual response to each as drafted:

Providing clear and accurate notice to users

To make its policies and their enforcement clearer for users, Facebook should:

1. Make technical arrangements to ensure that notice to users refers to the Community Standard enforced by the company. If Facebook determines that (i) the content does not violate the Community Standard notified to the user, and (ii) the content violates a different Community Standard, the user should be properly notified of this and given another opportunity to appeal. Users should always have access to the correct information before coming to the Board.

2. Include the satire exception, which is currently not communicated to users, in the public language of the Hate Speech Community Standard.

Having adequate tools in place to deal with issues of satire

To improve the accuracy of the enforcement of its content policies for the benefit of users, Facebook should:

3. Make sure that it has adequate procedures in place to assess satirical content and relevant context properly. This includes providing content moderators with: (i) access to Facebook’s local operation teams to gather relevant cultural and background information; and (ii) sufficient time to consult with Facebook’s local operation teams and to make the assessment. Facebook should ensure that its policies for content moderators incentivize further investigation or escalation where a content moderator is not sure if a meme is satirical or not.

Allowing users to communicate that their content falls within policy exceptions

To improve the accuracy of Facebook’s review in the appeals stage, the company should:

4. Let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy. This includes exceptions for satirical content and where users share hateful content to condemn it or raise awareness.

5. Ensure appeals based on policy exceptions are prioritized for human review.

*Procedural note:

The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. The company Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world, provided linguistic expertise.

Policies and topics
Freedom of expression, Humor, Politics
Cruel and insensitive
Region and countries
United States & Canada
United States
Platform
Facebook
Policies and topics
Freedom of expression, Humor, Politics
Cruel and insensitive
Region and countries
United States & Canada
United States
Platform
Facebook

Please note that this decision is available in both Turkish (via the ‘language’ tab accessed through the menu at the top of this screen) and Armenian (via this link).

Որոշման ամբողջական տարբերակը հայերենով կարդալու համար սեղմեք այստեղ.

Case summaryCase summary

The Oversight Board has overturned Facebook’s decision to remove a comment under its Hate Speech Community Standard. A majority of the Board found it fell into Facebook’s exception for content condemning or raising awareness of hatred.

About the case

On December 24, 2020, a Facebook user in the United States posted a comment with an adaptation of the ‘daily struggle’ or ‘two buttons’ meme. This featured the split-screen cartoon from the original ‘two buttons’ meme, but with a Turkish flag substituted for the cartoon character’s face. The cartoon character has its right hand on its head and appears to be sweating. Above the character, in the other half of the split-screen, are two red buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.”

While one content moderator found that the meme violated Facebook’s Hate Speech Community Standard, another found it violated its Cruel and Insensitive Community Standard. Facebook removed the comment under the Cruel and Insensitive Community Standard and informed the user of this.

After the user’s appeal, however, Facebook found that the content should have been removed under its Hate Speech Community Standard. The company did not tell the user that it upheld its decision under a different Community Standard.

Key findings

Facebook stated that it removed the comment as the phrase “The Armenians were terrorists that deserved it,” contained claims that Armenians were criminals based on their nationality and ethnicity. According to Facebook, this violated its Hate Speech Community Standard.

Facebook also stated that the meme was not covered by an exception which allows users to share hateful content to condemn it or raise awareness. The company claimed that the cartoon character could be reasonably viewed as either condemning or embracing the two statements featured in the meme.

The majority of the Board, however, believed that the content was covered by this exception. The ‘two buttons’ meme contrasts two different options not to show support for them, but to highlight potential contradictions. As such, they found that the user shared the meme to raise awareness of and condemn the Turkish government’s efforts to deny the Armenian genocide while, at the same time, justifying these same historic atrocities. The majority noted a public comment which suggested that the meme, “does not mock victims of genocide, but mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it.” The majority also believed that the content could be covered by Facebook’s satire exception, which is not included in the Community Standards.

The minority of the Board, however, found that it was not sufficiently clear that the user shared the content to criticize the Turkish government. As the content included a harmful generalization about Armenians, the minority of the Board found that it violated the Hate Speech Community Standard.

In this case, the Board noted that Facebook told the user that they violated the Cruel and Insensitive Community Standard when the company based its enforcement on the Hate Speech Community Standard. The Board was also concerned about whether Facebook’s moderators had the necessary time and resources to review content containing satire.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content and requires that the comment be restored.

In a policy advisory statement, the Board recommends that Facebook:

  • Inform users of the Community Standard enforced by the company. If Facebook determines that a user’s content violates a different Community Standard to the one the user was originally told about, they should have another opportunity to appeal.
  • Include the satire exception, which is not currently available to users, in the public language of its Hate Speech Community Standard.
  • Adopt procedures to properly moderate satirical content while taking into account relevant context. This includes providing content moderators with access to Facebook’s local operation teams and sufficient time to consult with these teams to make an assessment.
  • Let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy. This includes exceptions for satirical content and where users share hateful content to condemn it or raise awareness.
  • Make sure appeals based on policy exceptions are prioritized for human review.

*Case summaries provide an overview of the case and do not have precedential value.

Full case decisionFull case decision

1. Decision summary

The Oversight Board has overturned Facebook’s decision to remove content under its Hate Speech Community Standard. A majority of the Board found that the cartoon, in the form of a satirical meme, fell into the Hate Speech Community Standard’s exception for content that condemns hatred or raises awareness of it.

2. Case description

On December 24, 2020, a Facebook user in the United States posted a comment with an adaption of the “daily struggle” or “two buttons” meme. A meme is a piece of media, which is often humorous, that spreads quickly across the internet. This featured the same-split screen cartoon from the original meme, but with the Turkish flag substituted for the cartoon character’s face. The cartoon character has its right hand on its head and appears to be sweating. Above the character, in the other half of the split-screen, there are two red buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.” The meme was preceded by a "thinking face" emoji.

The comment was shared on a public Facebook page that describes itself as forum for discussing religious matters from a secular perspective. It responded to a post containing an image of a person wearing a niqab with overlay text in English: “Not all prisoners are behind bars.” At the time the comment was removed, that original post it responded to had 260 views, 423 reactions and 149 comments. A Facebook user in Sri Lanka reported the comment for violating the Hate Speech Community Standard.

Facebook removed the meme on December 24, 2020. Within a short period of time, two content moderators reviewed the comment against the company’s policies and reached different conclusions. While the first concluded that the meme violated Facebook’s Hate Speech policy, the second determined that the meme violated the Cruel and Insensitive policy. The content was removed and logged in Facebook’s systems based on the second review. On this basis, Facebook notified the user that their comment “goes against our Community Standard on cruel insensitive content.”

After the user’s appeal, Facebook upheld its decision but found that the content should have been removed under its Hate Speech policy. For Facebook, the statement “The Armenians were terrorists that deserved it” specifically violated the prohibition on content claiming that all members of a protected characteristic are criminals, including terrorists. No other parts of the content, such as the claim that the Armenian genocide was a lie, were deemed to be violating. Facebook did not inform the user that it upheld the decision to remove their content under a different Community Standard.

The user submitted their appeal to the Oversight Board on December 24, 2020.

Lastly, in this decision, the Board referred to the atrocities committed against the Armenian people from 1915 onwards as genocide, as this term is commonly used to describe the massacres and mass deportations suffered by Armenians and it is also referred to in the content under review. The Board does not have the authority to legally qualify such atrocities and this qualification is not the subject of this decision.

3. Authority and scope

The Board has authority to review Facebook’s decision following an appeal from the user whose post was removed (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1). The Board may uphold or reverse that decision (Charter Article 3, Section 5). The Board’s decisions are binding and may include policy advisory statements with recommendations. These recommendations are non-binding, but Facebook must respond to them (Charter Article 3, Section 4). The Board is an independent grievance mechanism to address disputes in a transparent and principled manner.

4. Relevant standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

Facebook's Community Standards define hate speech as “a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability.” Under “Tier 1,” prohibited content (“do not post”) includes content targeting a person or group of people on the basis of a protected characteristic with:

  • “dehumanizing speech or imagery in the form of comparisons, generalizations, or unqualified behavioral statements (in written or visual form) to or about […] criminals (including, but not limited to, “thieves”, “bank robbers”, or saying “All [protected characteristic or quasi-protected characteristic] are ‘criminals’”).”
  • speech “[m]ocking the concept, events or victims of hate crimes even if no real person is depicted in an image.”
  • speech “[d]enying or distorting information about the Holocaust.”

However, Facebook allows “content that includes someone else’s hate speech to condemn it or raise awareness.” According to the Hate Speech Community Standard’s policy rationale, “speech that might otherwise violate our standards can be used self-referentially or in an empowering way. Our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.”

Additionally, the Board noted Facebook’s Cruel and Insensitive Community Standard which forbids content that targets “victims of serious physical or emotional harm,” including “attempts to mock victims […] many of which take the form of memes and GIFs.” This policy prohibits content (“do not post”) that “contains sadistic remarks and any visual or written depiction of real people experiencing premature death.”

II. Facebook’s values

Facebook’s values are outlined in the introduction to the Community Standards. The value of “Voice” is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Facebook limits “Voice” in service of four values, and two are relevant here:

“Safety”: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.

“Dignity” : We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

II. Human rights standards

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In 2021, Facebook announced its Corporate Human Rights Policy, where it committed to respecting rights in accordance with the UNGPs. The Board's analysis in this case was informed by the following human rights standards:

  1. Freedom of expression: Article 19, International Covenant on Civil and Political Rights ( ICCPR), General Comment No. 34, Human Rights Committee, 2011; UN Special Rapporteur on freedom of opinion and expression reports: A/HRC/35/22/Add.3 (2017), A/HRC/41/35/Add.2 (2019), A/HRC/38/35 (2018), A/74/486 (2019), and A/HRC/44/49/Add.2 (2020); the Rabat Plan of Action, OHCHR, (2013).
  2. The right to non-discrimination: Article 2, para. 1, ICCPR; Articles 1 and 2, Convention on the Elimination of All Forms of Racial Discrimination (CERD).
  3. The right to be informed in the context of access to justice: Article 14, para. 3(a), ICCPR; General Comment No. 32, Human Rights Committee, (2007).

5. User statement

The user stated in their appeal to the Board that “historical events should not be censored.” They noted that their comment was not meant to offend but to point out “the irony of a particular historical event.” The user noted that “perhaps Facebook misinterpreted this as an attack.” The user further stated that even if the content invokes “religion and war,” it is not a “hot button issue.” The user found Facebook and its policies overly restrictive and argued that “[h]umor like many things is subjective and something offensive to one person may be funny to another.”

6. Explanation of Facebook’s decision

Facebook explained that it removed the comment as a Tier 1 attack under the Hate Speech Community Standard, specifically for violating its policy prohibiting content alleging that all members of a protected characteristic are criminals, including terrorists. According to Facebook, while the first statement in the meme “The Armenian Genocide is a lie” is a negative generalization, it did not directly attack Armenians and thus did not violate the company’s Community Standards. Facebook found that the second statement “The Armenians were terrorists that deserved it” directly attacked Armenians by alleging that they are criminals based on their ethnicity and nationality. This violated the company’s Hate Speech policy.

In its decision rationale, Facebook assessed whether the exception for content that shares hate speech to condemn it or raise awareness of it should apply in this case. Facebook argued that the meme did not fall into this exception, as the user was not clear they intended to condemn hate speech. Specifically, Facebook explained to the Board that the sweating cartoon character in the meme could be reasonably viewed as either condemning or embracing the statements. Facebook also explained that its Hate Speech policy previously included an exception for humor. The company clarified that it removed this exception in response to a Civil Rights Audit report (July 2020) and as part of its policy development. In its response to the Board, Facebook claimed that “creating a definition for what is perceived to be funny was not operational for Facebook’s at-scale enforcement.” However, in the Civil Rights Audit report, the company disclosed it maintained a narrower exception for satire which Facebook defines as content that “includes the use of irony, exaggeration, mockery and/or absurdity with the intent to expose or critique people, behaviors, or opinions, particularly in the context of political, religious, or social issues. Its purpose is to draw attention to and voice criticism about wider societal issues.” This exception is not included in its Community Standards. It appears to be separate from the exception for content that includes hate speech to condemn it or raise awareness of it.

Facebook also clarified that the content did not violate the Cruel and Insensitive policy, which prohibits “explicit attempts to mock victims,” including through memes, because it did not depict or name a real victim.

Facebook also stated that its removal of the content was consistent with its values of “Dignity” and “Safety,” when balanced against the value of “Voice.” According to Facebook, content that calls the Armenian people terrorists “is an affront to their dignity, can be experienced as demeaning or dehumanizing, and can even create risks of offline persecution and violence.”

Facebook argued that its decision was consistent with international human rights standards. Facebook stated that (a) its policy was “easily accessible” in the Community Standards, (b) the decision to remove the content was legitimate to protect “the rights of others from harm and discrimination,” and (c) its decision to remove the content was “necessary and proportionate to limit harm against Armenians.” To ensure that limits on expression were proportionate, Facebook argued that its Hate Speech policy applied to “a narrow set of generalizations.”

7. Third-party submissions

The Oversight Board received 23 public comments related to this case. Four of the comments were from Europe, one from Middle East and North Africa and 18 from United States and Canada.

The Board received comments from parties directly connected to issues of interest for this case. These included a descendant of victims of the Armenian genocide, organizations that study the nature, causes and consequences of genocide, as well as a former content moderator.

The submissions covered themes including: the meaning and use of the “daily struggle” or “two buttons” meme as adapted by the user in this case, whether the content was intended as a political critique of the Turkish government and its denial of the Armenian genocide, whether the content was mocking the victims of the Armenian genocide, and how Facebook’s Hate Speech and Cruel and Insensitive Community Standards relate to this case.

To read public comments submitted for this case, please click here.

8. Oversight Board analysis

The Board looked at the question of whether this content should be restored through three lenses: Facebook’s Community Standards; the company’s values; and its human rights responsibilities.

8.1 Compliance with Community Standards

The Board analyzed each of the two statements against Facebook’s Community Standards, before examining the effect of juxtaposing these statements in this version of the “daily struggle” or “two buttons” meme.

8.1.1. Analysis of the statement “The Armenian Genocide is a lie”

The Board noted that Facebook did not find this statement to violate its Hate Speech Community Standard. Facebook enforces its Hate Speech Community Standard by identifying (i) a “direct attack,” and (ii) a “protected characteristic” the direct attack was based upon. The policy rationale lists “dehumanizing speech” as an example of an attack. Ethnicity and national origin are included among the list of protected characteristics.

Under the “do not post” section of its Hate Speech policy, Facebook prohibits speech “[m]ocking the concept, events or victims of hate crimes even if no real person is depicted in an image.” A majority of the Board noted, however, that the user’s intent was not to mock the victims of the events referred to in the statement, but to use the meme, in the form of satire, to criticize the statement itself. For the minority, the user’s intent was not sufficiently clear: the user could have shared the content to embrace the statement rather than to refute it.

In this case, Facebook notified the user that their content violated the Cruel and Insensitive Community Standard. Under this policy, Facebook prohibits “attempts to mock victims [of serious physical or emotional harm],” including content that “contains sadistic remarks and any visual or written depiction of real people experiencing premature death.” The Board noted, without assessing it further, Facebook’s explanation that this policy is not applicable to this case because the meme does not depict or name the victims of the events referred to in the statement.

Under the “do not post” section of its Hate Speech policy, Facebook also prohibits speech “[d]enying or distorting information about the Holocaust.” The Board noted the company’s explanation that this policy does not apply to the Armenian genocide or other genocides, and that the policy was based on the company’s “consultation with external experts, the well-documented rise in anti-Semitism globally, and the alarming level of ignorance about the Holocaust.”

8.1.2. Analysis of the statement “The Armenians were terrorists that deserved it”

The Board noted that Facebook found this statement to violate its Hate Speech Community Standard. The “do not post” section of that Community Standard prohibits “[d]ehumanizing speech or imagery in the form of comparisons, generalizations, or unqualified behavioral statements (in written or visual form),” including speech that portrays the targeted group as “criminals.” The Board believed the term “terrorists” fell into this category.

8.1.3. Analysis of the combined statements in the meme

The Board was of the view that the content should be evaluated as a whole, including the effect of juxtaposing the two statements in a well-known meme. A common purpose of the “daily struggle” or “two buttons” meme is to contrast two different options in order to highlight potential contradictions or other connotations, rather than to indicate support for the options presented.

For the majority, the exception to the Hate Speech policy is crucial. This exception allows people to “share content that includes someone else’s hate speech to condemn it or raise awareness.” It also states: “our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.” The majority noted that the content could also fall under the company’s satire exception, which is not publicly available.

Assessing the content as a whole, the majority found that the user’s intent was clear. They shared the meme as satire to raise awareness about and condemn the Turkish government’s efforts to deny the Armenian genocide while, at the same time, justifying the same historic atrocities. The user’s intent was not to mock the victims of these events, nor to claim those victims were criminals or that the atrocity was justified. The majority took into account the Turkish government’s position on the genocide suffered by Armenians from 1915 onwards (Republic of Turkey, Ministry of Foreign Affairs) as well as the history between Turkey and Armenia. In this context, they found that the replacement of the cartoon character’s sweating face with a Turkish flag, together with the content’s direct link to the Armenian genocide, meant that the user shared the meme to criticize the Turkish government’s position on this issue. The use of the “thinking face” emoji alongside the meme, an emoji commonly used sarcastically, supports this conclusion. The majority noted public comment “PC-10007” (made available under section 7 above), which suggested that “this meme, as described, does not mock victims of genocide, but mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it.” It would thus be wrong to remove this comment in the name of protecting Armenians when the post criticizes the Turkish government in support of Armenians.

As such, the majority found that, taken as a whole, the content fell within the policy exception in Facebook’s Hate Speech Community Standard. For the minority, in the absence of specific context, the user’s intent was not sufficiently clear to conclude that the content was shared as satire criticizing the Turkish government. Additionally, the minority found that the user was not able to properly articulate what the alleged humor intended to express. Given the content includes a harmful generalization against Armenians, the minority found that it violated the Hate Speech Community Standard.

8.2 Compliance with Facebook’s values

A majority of the Board believed that restoring this content is consistent with Facebook’s values. The Board recognized the Armenian community’s sensitivity to statements concerning the mass-scale atrocities suffered by Armenians from 1915 onwards, as well as the community’s long struggle to seek recognition of the genocide and justice for these atrocities. However, the majority did not find any evidence that the meme in this case posed a risk to “Dignity” and “Safety” that would justify displacing “Voice.” The majority also noted that Facebook referred broadly to “Safety” without explaining how that value applied in this case.

The minority found that, while satire should be protected, as the majority rightly stated, the statements in the comment damaged the self-respect of people whose ancestors suffered genocide. The minority also found the statements disrespectful of the honor of those who were massacred, and harmful, as they could increase the risk of discrimination and violence against Armenians. This justified displacing “Voice” to protect “Safety” and “Dignity.”

8.3 Compliance with Facebook’s human rights responsibilities

Freedom of expression (Article 19 ICCPR)

Article 19, para. 2 of the ICCPR provides broad protection for expression of “all kinds,” including written and non-verbal “political discourse,” as well as “cultural and artistic expression.” The UN Human Rights Committee has made clear the protection of Article 19 extends to expression that may be considered “deeply offensive” (General Comment No. 34, paras. 11, 12).

In this case, the Board found that the cartoon, in the form of a satirical meme, took a position on a political issue: the Turkish government’s stance on the Armenian genocide. The Board noted that “cartoons that clarify political positions” and “memes that mock public figures” may be considered forms of artistic expression protected under international human rights law (UN Special Rapporteur on freedom of expression, report A/HRC/44/49/Add.2, at para. 5). The Board further emphasized that the value placed by the ICCPR upon uninhibited expression concerning public figures in the political domain and public institutions “is particularly high” (General Comment No. 34, para. 38).

The Board also noted that laws imposing general prohibitions on the expression of erroneous opinions or incorrect interpretations of historical events, often justified by reference to hate speech, are incompatible with Article 19 of the ICCPR unless the expression amounts to incitement to discrimination, hostility or violence under Article 20 of the ICCPR (General Comment 34, para. 29; UN Special Rapporteur on freedom of expression, report A/74/486, at para. 22).

While the right to freedom of expression is fundamental, it is not absolute. It may be restricted, but restrictions should meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). Facebook should seek to align its content moderation policies on hate speech with these principles (UN Special Rapporteur on freedom of expression, report A/74/486, at para. 58(b)).

I. Legality

Any rules restricting expression must be clear, precise, and publicly accessible (General Comment 34, para. 25). Individuals must have enough information to determine if and how their speech may be limited, so that they can adjust their behavior accordingly. Facebook’s Community Standards “permit content that includes someone else’s hate speech to condemn it or raise awareness,” but ask users to “clearly indicate their intent.” The Board also noted that Facebook removed an exception for humor from its Hate Speech policy following a Civil Rights Audit concluded in July 2020. Although that exception was removed, the company kept a narrower exception for satire, which is currently not communicated to users in the Hate Speech Community Standard.

The Board also noted that Facebook incorrectly informed the user that they had violated the Cruel and Insensitive Community Standard, when Facebook in fact based its enforcement on the Hate Speech policy. The Board found that it is not clear enough to users that the Cruel and Insensitive Community Standard applies only to content that depicts or names victims of harm.

Additionally, the Board found that properly notifying users of the reasons for enforcement action against them would help users follow Facebook’s rules. This relates to the legality issue, as the lack of relevant information for users subject to content removal “creates an environment of secretive norms, inconsistent with the standards of clarity, specificity and predictability,” which may interfere with “the individual’s ability to challenge content actions or follow up on content-related complaints” (UN Special Rapporteur on freedom of expression, report A/HRC/38/35, at para. 58). Facebook’s approach to user notice in this case therefore failed the legality test.

II. Legitimate aim

Any restriction on freedom of expression should also pursue a “legitimate aim.” The Board agreed the restriction pursued the legitimate aim of protecting the rights of others (General Comment No. 34, para. 28). These include the rights to equality and non-discrimination, including based on ethnicity and national origin (Article 2, para. 1, ICCPR; Articles 1 and 2, ICERD).

The Board also reaffirmed its finding in case decision 2021-002-FB-UA that “it is not a legitimate aim to restrict expression for the sole purpose of protecting individuals from offense (UN Special Rapporteur on freedom of expression, report A/74/486, para. 24), as the value international human rights law placed on uninhibited expression is high (General Comment No. 34, para. 38).”

III. Necessity and proportionality

Any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment 34, para. 34).

The Board assessed whether the content removal was necessary to protect the rights of Armenians to equality and non-discrimination. The Board noted that freedom of expression currently faces substantial restrictions in Turkey, with disproportionate effects on ethnic minorities living in the country, including Armenians. In a report on his mission to Turkey in 2016, the UN Special Rapporteur on freedom of expression found censorship to be operating in “all the places that are fundamental to democratic life: the media, educational institutions, the judiciary and the bar, government bureaucracy, political space and the vast online expanses of the digital age” (UN Special Rapporteur on freedom of expression, report A/HRC/35/22/Add.3, at para. 7). In the follow-up report published in 2019, the UN Special Rapporteur mentioned that the situation had not improved (UN Special Rapporteur on freedom of expression, report A/HRC/41/35/Add.2, at para. 26).

Turkish authorities have specifically targeted expression denouncing the atrocities committed by the Turkish Ottoman Empire against Armenians from 1915 onwards. In a joint allegation letter, a number of UN special procedures mentioned that Article 301 of the Turkish Criminal Code appears to constitute “a deliberate effort to obstruct access to the truth about what appears to be [a] policy of violence directed against the Turkish Armenian community” and “the right of victims to justice and reparation.” The Board also noted the assassination, in 2007, of Hrant Dink, a journalist of Armenian origin who published a number of articles on the identity of Turkish citizens of Armenian origin. In one of these articles, Dink discussed the lack of recognition of the genocide and how this affects the identity of Armenians. Dink had previously been found guilty by Turkish courts of demeaning “Turkish identity” through his writing. In 2010, the European Court of Human Rights concluded that the verdict against Dink and the failure of Turkish authorities to take appropriate measures to protect his life amounted to a violation of his freedom of expression (see European Court of Human Rights, Dink v Turkey, para. 139).

A majority of the Board concluded that Facebook’s interference with the user’s freedom of expression was mistaken. The removal of the comment would not protect the rights of Armenians to equality and non-discrimination. The user was not endorsing the statements contrasted in the meme, but rather attributing them to the Turkish government. They did this to condemn and raise awareness of the government’s contradictory and self-serving position. The majority found that the effects of satire, such as this meme, would be lessened if people had to explicitly declare their intent. The fact that the “two buttons” or “daily struggle” meme is usually intended to be humorous, even though the subject matter here was serious, also contributed to the majority’s decision.

The majority also noted that the content was shared in English on a Facebook page with followers based in several countries. While the meme could be misinterpreted by some Facebook users, the majority found that it did not increase the risk of Armenians being subjected to discrimination and violence, especially as the content was aimed at an international audience. They found that bringing this important issue to an international audience was in the public interest.

Additionally, the Board found that removing content without adequate justification cannot be proportionate, and that removing content that informs the public on a matter of public interest requires particularly weighty reasons. In this regard, the Board was concerned about Facebook content moderators’ capacity to review this meme and similar satirical content. Content moderators should follow adequate procedures and be given the time, resources and support needed to assess satirical content and its context properly.

While supporting the majority’s views on protecting satire on the platform, the minority did not believe that the content was satire. The minority found that the user could be embracing the statements contained in the meme, and thus engaging in discrimination against Armenians. The minority therefore held that the requirements of necessity and proportionality had been met in this case. In case decision 2021-002-FB-UA, the Board noted Facebook’s position that content depicting blackface would be removed unless the user clearly indicated an intent to condemn the practice or raise awareness of it. The minority found that, similarly, where the satirical nature of content is not obvious, as in this case, the user’s intent should be made explicit. The minority concluded that, while satire involves ambiguity, it should not be ambiguous about the target of the attack, in this case whether that target was the Turkish government or the Armenian people.

Right to be informed (Article 14, para. 3(a), ICCPR)

The Board found that the incorrect notice given to the user about which content rule they had violated implicates the right to be informed in the context of access to justice (Article 14, para. 3(a), ICCPR). When limiting a user’s right to expression, Facebook must respect due process and accurately inform the user of the basis of its decision, including by revising that notice where the reason changes (General Comment No. 32, para. 31). Facebook failed to meet that responsibility in this case.

9. Oversight Board decision

The Oversight Board overturns Facebook’s decision to remove the content and requires the content to be restored.

10. Policy advisory statement

The following recommendations are numbered, and the Board requests that Facebook provide an individual response to each as drafted:

Providing clear and accurate notice to users

To make its policies and their enforcement clearer for users, Facebook should:

1. Make technical arrangements to ensure that notices to users refer to the Community Standard the company has actually enforced. If Facebook determines that (i) the content does not violate the Community Standard notified to the user, and (ii) the content violates a different Community Standard, the user should be properly notified of this and given another opportunity to appeal. Users should always have access to the correct information before coming to the Board.

2. Include the satire exception, which is currently not communicated to users, in the public language of the Hate Speech Community Standard.

Having adequate tools in place to deal with issues of satire

To improve the accuracy of the enforcement of its content policies for the benefit of users, Facebook should:

3. Make sure that it has adequate procedures in place to assess satirical content and relevant context properly. This includes providing content moderators with: (i) access to Facebook’s local operation teams to gather relevant cultural and background information; and (ii) sufficient time to consult with Facebook’s local operation teams and to make the assessment. Facebook should ensure that its policies for content moderators incentivize further investigation or escalation where a content moderator is not sure whether a meme is satirical.

Allowing users to communicate that their content falls within policy exceptions

To improve the accuracy of Facebook’s review in the appeals stage, the company should:

4. Let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy, including the exception for satirical content and the exception for sharing hateful content to condemn it or raise awareness of it.

5. Ensure appeals based on policy exceptions are prioritized for human review.

*Procedural note:

The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context. The company Lionbridge Technologies, LLC, whose specialists are fluent in more than 350 languages and work from 5,000 cities across the world, provided linguistic expertise.