Oversight Board Overturns Facebook Decision in “Two Buttons” Meme Case

The Oversight Board has overturned Facebook’s decision to remove a comment under its Hate Speech Community Standard. A majority of the Board found that the content fell within Facebook’s exception for content that condemns or raises awareness of hatred.

About the Case

On December 24, 2020, a Facebook user in the United States posted a comment with an adaptation of the ‘daily struggle’ or ‘two buttons’ meme. This featured the split-screen cartoon from the original ‘two buttons’ meme, but with a Turkish flag substituted for the cartoon character’s face. The cartoon character has its right hand on its head and appears to be sweating. Above the character, in the other half of the split-screen, are two red buttons with corresponding statements in English: “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.”

While one content moderator found that the meme violated Facebook’s Hate Speech Community Standard, another found that it violated the Cruel and Insensitive Community Standard. Facebook removed the comment under the Cruel and Insensitive Community Standard and informed the user of this.

After the user’s appeal, however, Facebook found that the content should have been removed under its Hate Speech Community Standard. The company did not tell the user that it upheld its decision under a different Community Standard.

Key Findings

Facebook stated that it removed the comment because the phrase “The Armenians were terrorists that deserved it” contained claims that Armenians were criminals based on their nationality and ethnicity. According to Facebook, this violated its Hate Speech Community Standard.

Facebook also stated that the meme was not covered by an exception which allows users to share hateful content to condemn it or raise awareness. The company claimed that the cartoon character could be reasonably viewed as either condemning or embracing the two statements featured in the meme.

The majority of the Board, however, believed that the content was covered by this exception. The ‘two buttons’ meme contrasts two options not to endorse either of them, but to highlight potential contradictions between them. As such, the majority found that the user shared the meme to raise awareness of, and condemn, the Turkish government’s efforts to deny the Armenian genocide while simultaneously justifying those same historic atrocities. The majority noted a public comment suggesting that the meme “does not mock victims of genocide, but mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it.” The majority also believed that the content could be covered by Facebook’s satire exception, which is not included in the Community Standards.

The minority of the Board, however, found that it was not sufficiently clear that the user shared the content to criticize the Turkish government. As the content included a harmful generalization about Armenians, the minority of the Board found that it violated the Hate Speech Community Standard.

In this case, the Board noted that Facebook told the user they had violated the Cruel and Insensitive Community Standard, even though the company based its enforcement on the Hate Speech Community Standard. The Board was also concerned about whether Facebook’s moderators had the necessary time and resources to review content containing satire.

The Oversight Board’s Decision

The Oversight Board overturns Facebook’s decision to remove the content and requires that the comment be restored.

In a policy advisory statement, the Board recommends that Facebook:

  • Inform users of the Community Standard enforced by the company. If Facebook determines that a user’s content violates a different Community Standard from the one the user was originally told about, the user should be given another opportunity to appeal.
  • Include the satire exception, which is not currently available to users, in the public language of its Hate Speech Community Standard.
  • Adopt procedures to properly moderate satirical content while taking into account relevant context. This includes providing content moderators with access to Facebook’s local operation teams and sufficient time to consult with these teams to make an assessment.
  • Let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy. This includes exceptions for satirical content and where users share hateful content to condemn it or raise awareness.
  • Make sure appeals based on policy exceptions are prioritized for human review.

For Further Information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.