Overturned

Promotional Clip for Interview in Iraq


Decision Type
Summary

Policies and Topics
Topic: News events, journalism, politics
Community Standard: Dangerous Organizations and Individuals

Region/Country
Iraq

Platform
Facebook

Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Summary

A user appealed Meta’s decision to remove a Facebook post of a video featuring a clip of a televised interview with a prominent Iraqi Shia cleric and political figure. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.

About the Case

In August 2023, the head of the Iraqi Media Network, a government holding company for media organizations, posted on Facebook a clip of an interview with Qais Hadi Sayed Hasan al-Khazali, a politician, cleric and militant leader of Asa’ib Ahl al-Haq, an Iraqi Shia paramilitary organization and political party. In the clip, al-Khazali talks about various topics, including the United States’ and Iraq’s relations with Israel. An accompanying caption, in Arabic, reads: “Tonight ... an important discussion on the latest political developments and recent foreign forces movements with Sheikh Qais Al-Khazali, Secretary-General of Asa’ib Ahl al-Haq movement,” and contains the hashtag “#IraqiNews.” Overlaid text in Arabic on the video reads: “Exclusive Interview” and “Tonight at 10 p.m.”

Under its Dangerous Organizations and Individuals (DOI) policy, Meta designates al-Khazali as a dangerous individual and Asa’ib Ahl al-Haq as a dangerous organization. The company explains that Tier 1 of this policy focuses on “terrorist organizations, including entities and individuals designated by the United States government as foreign terrorist organizations (FTOs) or specially designated global terrorists (SDGTs).” However, Meta does not publicly disclose its list of DOIs.

Under the DOI Community Standard, the company removes the “glorification, support and representation” of individuals or organizations that “proclaim a violent mission or are engaged in violence.” The policy allows for “content that includes references to designated dangerous organizations and individuals in the context of social and political discourse,” including content “reporting on, neutrally discussing or condemning” dangerous organizations and individuals or their activities. Under this policy, Meta explains that news reporting includes “information that is shared to raise awareness about local and global events in which designated dangerous organizations and individuals are involved.”

After the Board brought this case to Meta’s attention, the company determined that the content should not have been removed under the DOI policy, given that the overlay text “Exclusive Interview” and the hashtag “#IraqiNews” suggested that the content was reporting on an interview with al-Khazali. Additionally, the company noted that “the clips do not convey any positive judgement around Al-Khazali or his actions.” The company then restored the content to Facebook.

Board Authority and Scope

The Board has authority to review Meta’s decision following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.

Significance of Case

This case highlights the over-enforcement of Meta’s DOI Community Standard. The company’s content moderation systems failed to recognize the allowance for reporting involving designated entities. The Board previously noted in the Karachi Mayoral Election Comment summary decision that such mistakes can negatively impact users’ ability to “share political commentary and news reporting” about organizations labeled as “dangerous,” thereby infringing on freedom of expression.

The Board has issued several relevant previous recommendations aiming to increase transparency around, and the accuracy of, the enforcement of the DOI policy, including its exceptions. First, the Board recommended that Meta “assess the accuracy of reviewers enforcing the reporting allowance under the Dangerous Organizations and Individuals policy in order to identify systemic issues causing enforcement errors” (Mention of the Taliban in News Reporting, recommendation no. 5). In November 2023, Meta reported implementing an update to the DOI policy, which included details about how the company approaches news reporting as well as neutral and condemning discussion. Meta also reported “tracking the accuracy of enforcement of the social and political discourse carveout” (Meta’s Q3 2023 Quarterly Update on the Oversight Board). The company, however, has not published information to demonstrate implementation.

Additionally, in one of its policy advisory opinions, the Board asked Meta to “explain the methods it uses to assess the accuracy of human review and the performance of automated systems in the enforcement of its Dangerous Organizations and Individuals policy” (Referring to Designated Individuals as Shaheed, recommendation no. 6). The Board considers Meta to have reframed this recommendation. The company stated that it conducts audits to assess the accuracy of its content moderation decisions and that these inform areas for improvement. Meta did not, however, explain the methods it deploys to perform these assessments.

The repeated over-enforcement of the company’s DOI policy undermines the ability of users to share legitimate news reporting and information about the activities of designated individuals or organizations. The Board believes that full implementation of both recommendations mentioned above would further strengthen Meta’s ability to improve enforcement accuracy, reducing the adverse impact of errors on user speech. Public reporting on the accuracy of reviews, and on the methods deployed to assess them, would increase transparency and generate engagement with Meta that has the potential to lead to further improvements.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.
