Overturned

Karachi Mayoral Election Comment

Type of Decision
Summary

Topic
Elections, Freedom of expression, Politics

Community Standard
Dangerous Organizations and Individuals

Location
Pakistan

Platform
Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comments process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A Facebook user appealed Meta’s decision to remove their comment showing the 2023 Karachi mayoral election results and containing the name of Tehreek-e-Labbaik Pakistan (TLP), a far-right Islamist political party designated under Meta’s Dangerous Organizations and Individuals policy. This case highlights the over-enforcement of this policy and its impact on users’ ability to share political commentary and news reporting. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the comment.

Case Description and Background

In June 2023, a Facebook user commented on a post featuring a photograph of Karachi politician Hafiz Naeem ur Rehman with former Pakistani Prime Minister Imran Khan and the Secretary General of the Jamaat-e-Islami political party, Liaqat Baloch. The comment is an image of a graph taken from a television program, showing the number of seats won by the various parties in the Karachi mayoral election. One of the parties listed is Tehreek-e-Labbaik Pakistan (TLP), a far-right Islamist political party in Pakistan. The 2023 Karachi mayoral election was a contested race: one losing party alleged that the vote had been rigged, and violent protests broke out between supporters of different parties.

Meta originally removed the comment from Facebook, citing its Dangerous Organizations and Individuals policy, under which the company removes content that “praises,” “substantively supports” or “represents” individuals and organizations it designates as dangerous. However, the policy recognizes that “users may share content that includes references to designated dangerous organizations and individuals in the context of social and political discourse. This includes content reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities.”

In the appeal to the Board, the user identified themselves as a journalist and stated that the comment was about the Karachi mayoral election results, clarifying that its intention was to inform the public and discuss the democratic process.

After the Board brought this case to Meta’s attention, the company determined that the content did not violate its policies. Meta’s policy allows neutral discussion of a designated entity in the context of social and political discourse; here, the comment reported on the outcome of an election.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights over-enforcement of Meta’s Dangerous Organizations and Individuals policy. The Board’s cases suggest that such errors are frequent and that they impede the ability of users, especially journalists, to report factual information about organizations designated as dangerous. The company should make reducing these errors a high priority.

The Board has issued several recommendations regarding Meta’s Dangerous Organizations and Individuals policy. These include a recommendation to “evaluate automated moderation processes for enforcement of the DOI policy,” which Meta declined to implement (Öcalan’s Isolation decision, recommendation no. 2). The Board has also recommended that Meta “assess the accuracy of reviewers enforcing the reporting allowance under the DOI policy to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting decision, recommendation no. 5). Meta is in the process of implementing an update to its Dangerous Organizations and Individuals policy that will include details about how the company approaches news reporting as well as neutral and condemning discussion. Furthermore, the Board has recommended that Meta “provide a public list of the organizations and individuals designated ‘dangerous’ under the Dangerous Individuals and Organizations Community Standard,” which Meta declined to implement after a feasibility assessment (Nazi Quote decision, recommendation no. 3).

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
