Oversight Board Overturns Facebook Decision in "Protest in India Against France" Case

The Oversight Board has overturned Facebook’s decision to remove a post under its Violence and Incitement Community Standard. While the company considered that the post contained a veiled threat, a majority of the Board believed it should be restored. This decision should be implemented only once the user has been notified and has given their consent.

About the case

In late October 2020, a Facebook user posted in a public group described as a forum for Indian Muslims. The post contained a meme featuring an image from the Turkish television show “Diriliş: Ertuğrul” depicting one of the show’s characters in leather armor holding a sheathed sword. The meme had a text overlay in Hindi. Facebook’s translation of the text into English reads: “if the tongue of the kafir starts against the Prophet, then the sword should be taken out of the sheath.” The post also included hashtags referring to President Emmanuel Macron of France as the devil and calling for the boycott of French products.

In its referral, Facebook noted that this content highlighted the tension between what it considered religious speech and a possible threat of violence, even where that threat is not made explicit.

Key findings

Facebook removed the post under its Violence and Incitement Community Standard, which states that users should not post coded statements where “the threat is veiled or implicit.” Facebook identified “the sword should be taken out of the sheath” as a veiled threat against “kafirs,” a term which the company interpreted as having a retaliatory tone against non-Muslims.

Considering the circumstances of the case, the majority of the Board did not believe that this post was likely to cause harm. They questioned Facebook’s rationale: the company indicated that threats of violence against Muslims heightened its sensitivity when assessing such threats, yet this heightened sensitivity also applied when moderating content posted by members of this group.

While a minority viewed the post as threatening some form of violent response to blasphemy, the majority considered the references to President Macron and the boycott of French products as calls to action that are not necessarily violent. Although the television show character holds a sword, the majority interpreted the post as criticizing Macron’s response to religiously motivated violence, rather than threatening violence itself. The Board notes that its decision to restore this post does not imply endorsement of its content.

Under international human rights standards, people have the right to seek, receive and impart ideas and opinions of all kinds, including those that may be controversial or deeply offensive. As such, a majority considered that just as people have the right to criticize religions or religious figures, religious people also have the right to express offense at such expression.

Restrictions on expression must be easily understood and accessible. In this case, the Board noted that Facebook’s process and criteria for determining veiled threats are not explained to users in the Community Standards.

In conclusion, a majority found that, for this specific post, Facebook did not accurately assess all contextual information and that international human rights standards on expression justify the Board’s decision to restore the content.

The Oversight Board’s decision

The Board overturns Facebook’s decision to take down the content, requiring the post to be restored.

As a policy advisory statement, the Board recommends that:

  • Facebook provide users with additional information regarding the scope and enforcement of restrictions on veiled threats. This would help users understand what content is allowed in this area. Facebook should make its enforcement criteria public. These criteria should consider the intent and identity of the user, as well as their audience and the wider context.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.
