Meta Must Loosen Restrictions on Online Speech in Arabic-speaking and Muslim Communities, Warns the Oversight Board  

  • Oversight Board review of Meta’s most censored word, “shaheed,” suggests ways to tackle discriminatory impact on expression and news reporting   
  • Conflict in Gaza driving fresh concerns that the policy may be contributing to censorship of those commenting on the violence, with the Board’s proposals shown to be the most suitable approach even in this time of crisis   
  • Meta was right to request review but now urged to adopt more nuanced policy position   

Meta’s current restrictive approach to moderation, based on concerns about how the word “shaheed” could be used to praise or approve of terrorism, has led to widespread and unnecessary censorship affecting the freedom of expression of millions of users, the Oversight Board said on Tuesday.  

The company has long struggled with how to account for the linguistic complexity, and the cultural and religious significance, of the Arabic word “shaheed,” which has multiple meanings but can be roughly translated to “martyr” in English.  

The Board’s extensive review found that Meta should end its blanket ban on “shaheed” when linked to people Meta considers to be terrorists, as this has had a discriminatory and disproportionate impact on freedom of expression and information sharing, outweighing concerns that the word could be used to promote terrorism. As part of its review, Meta asked the Board to reevaluate its internal trade-offs between speech and safety, admitting that its approach could have caused swathes of content to be wrongly removed.  

“Terrorism destroys lives and undermines the very fabric of our societies, but it is counterproductive to stop journalists from reporting on terrorist groups and to limit people’s ability to debate and condemn the violence they see around them just because of the presence of a single word,” said Oversight Board co-chair Helle Thorning-Schmidt.   

“Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all. The company was right to ask the Board to investigate this extremely complex issue and to be transparent about the challenges. We have shown how to change course and urge Meta to adopt a more nuanced approach that keeps people safe without unnecessarily limiting speech in many parts of the world.”  

The conflict in Gaza, which has seen complaints that the voices of people affected by the war have been silenced, has only intensified calls to improve content moderation in times of crisis.   

The Board was finalizing this Policy Advisory Opinion (PAO), a deep dive into a complex policy issue at Meta, when the October 7, 2023, terrorist attacks in Israel occurred, followed by ongoing Israeli military operations in Gaza. The Board extended its research to examine how people use Meta’s platforms and the word “shaheed” in this context and found that its recommendations held up even under the stress of the current phase of the conflict.    

To better protect all users, the Board said Meta should end the blanket ban on “shaheed” when used in reference to people Meta designates as terrorists. Instead, it should remove the word only when it is linked to a clear signal of violence (such as imagery of weapons) or when it otherwise breaks Meta’s rules (for example, by glorifying a designated individual). This would see the most harmful material removed while minimizing the chances of intentionally or accidentally removing non-violating content posted around the world. Beyond Arabic, “shaheed” is a common loanword among Muslim communities, most notably across Asia, Africa and the Middle East. It can also be used by non-Muslims who speak languages and dialects in which the word is common.   

“This blunt method is doing more harm than good. It can even lead to those speaking about deceased loved ones having their content taken down in error,” said Thorning-Schmidt. “The reality is that the communities worst hit by the current policy, such as those in conflict zones like Gaza and Sudan, also live in contexts where censorship is rife. The Board is especially concerned that Meta’s approach impacts journalism and civic discourse because media organizations and commentators might shy away from reporting on designated entities to avoid content removals.”    

While the company has always interpreted the term as violating its policies when used to refer to designated individuals, “shaheed” has myriad other non-violating uses that are susceptible to over-enforcement, including in news reporting, in discussions of terrorist actors, and in describing victims of terrorism or other kinds of violence. Meta has various other policies for addressing the glorification of terrorism and violence online. These are robust and, when applied and enforced properly, should ensure violating content does not make it onto the platform, the Board said.  

Meta conducted a policy review into its moderation of “shaheed” in 2020 but was unable to decide how to proceed, and asked the Board to intervene last year.    

When conflict last escalated in Gaza in 2021, the Board took a case about neutral news reporting from Al Jazeera that was wrongly removed. The Board expressed concern about the unfair removal of Arabic content and pushed the company to conduct an independent human rights review. This found Meta’s policies adversely impacted Arabic speakers’ freedom of expression and assembly. While Meta committed to undertaking various reforms, the new phase of the conflict has raised fresh questions about discriminatory censorship.   

Note to Editors:    

The Board recommends that posts including “shaheed” in reference to a Tier 1 designated individual should be removed only when accompanied by one or more of these signals of violence: a visual depiction of arms, a statement of intent or advocacy to use or carry arms, or a reference to a designated event.    

Background:    

The Oversight Board is an independent deliberative body comprising global experts who hold Meta accountable to its Community Standards for Facebook, Instagram and Threads, as well as to its human rights commitments. The Board has contractually binding authority over Meta’s decisions to remove content from, or leave it up on, its platforms. The Board also issues non-binding recommendations that shape Meta’s policies to ensure the company is more transparent, enforces its rules evenly and treats users more fairly. The Oversight Board’s decisions are approved by a majority of the Board and do not necessarily represent the personal views of all Members. Meta has 60 days to respond to the Board’s recommendations.  