
Overturned

Bengali Debate about Religion

A user appealed Meta’s decision to remove a Facebook post with a link to a YouTube video that addressed Islamic scholars’ unwillingness to discuss atheism.

Type of Decision

Summary

Policies and Topics

Topic
Freedom of expression, Marginalized communities, Religion
Community Standard
Coordinating Harm and Promoting Crime

Region/Countries

Location
Bangladesh, India

Platform

Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post with a link to a YouTube video that addressed Islamic scholars’ unwillingness to discuss atheism. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

Case Description and Background

In May 2023, a user who identifies themselves as an atheist and critic of religion posted a link to a YouTube video on Facebook. The thumbnail image of the video asks, in Bengali, “Why are Islamic scholars afraid to debate the atheists on video blogs?” and contains an image of two Islamic scholars. The caption of the post states, “Join the premiere to get the answer!” The content had approximately 4,000 views.

In their appeal to the Board, the user claimed that the purpose of sharing the video was to promote a “healthy debate or discussion” with Islamic scholars, specifically on topics such as the theory of evolution and the Big Bang theory. The user stated that the post adheres to Facebook’s Community Standards by “promoting open discussion.” Furthermore, the user stressed that Bangladeshi atheist activists are frequently subject to censorship and physical harm.

Meta initially removed the content under its Coordinating Harm and Promoting Crime policy, which prohibits content “facilitating, organizing, promoting or admitting to certain criminal or harmful activities targeted at people, businesses, property or animals.” Meta acknowledged that the content does not violate this policy, even though the views espoused by the user may be seen as “provocative to many Bangladeshis.” Meta offered no further explanation of why the content was removed from the platform. Although a direct attack against people based on their religious affiliation could be removed under Meta’s separate Hate Speech policy, nothing in the company’s policies prohibits critiquing a religion’s concepts or ideologies.

After the Board brought this case to Meta’s attention, the company determined that the content did not violate the Coordinating Harm and Promoting Crime policy and that its removal was incorrect. The company then restored the content to Facebook.

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights an error in Meta’s enforcement of its Coordinating Harm and Promoting Crime policy. These types of enforcement errors further limit freedom of expression for members of groups who are already subjected to intense censorship by state actors.

Meta’s Coordinating Harm and Promoting Crime policy allows users to discuss and debate “harmful activities,” but only “so long as they do not advocate or coordinate harm,” and many of the policy’s clauses turn on a user’s intent. In this case, while the user’s post could be interpreted as provocative, given the documented animosity towards Bangladeshi atheist activists, the user was not advocating or coordinating harm under Meta’s definition of those activities; the removal therefore reflects a misinterpretation of the user’s intent. The Board has previously recommended that the company clarify how users can make non-violating intent clear, addressing a similar distinction in Meta’s Dangerous Organizations and Individuals policy. Regarding that policy, the Board urged Meta to “Explain in the Community Standards how users can make the intent behind their posts clear to [Meta] ... Facebook should also provide illustrative examples to demonstrate the line between permitted and prohibited content” (Ocalan’s Isolation decision, recommendation no. 6). Meta partially implemented this recommendation.

Furthermore, the Board has issued recommendations aimed at preventing enforcement errors more broadly. The Board asked Meta to “[i]mplement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Meta claimed that it already does this work, but has not published information to demonstrate it. Additionally, the Board requested that Meta “[i]mprove its transparency reporting to increase public information on error rates by making the information viewable by country and language for each Community Standard… more detailed transparency reports will help the public spot areas where errors are more common, including potential specific impacts on minority groups, and alert Facebook to correct them” (Punjabi Concern Over the RSS in India decision, recommendation no. 3). Meta is still assessing the feasibility of this recommendation.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought this case to the company’s attention. The Board also urges Meta to speed up the implementation of its still-open recommendations to reduce such errors.
