Manipulated Video Should Have High-Risk Label

The Oversight Board has upheld Meta’s decision to leave up a Facebook post that shared a manipulated video during a political crisis in the Philippines, but notes that the content should have been labeled “High-Risk” because of its significant potential to deceive users on a matter of public importance. The Board recommends that Meta publicly explain its manipulated media labels and how it applies them. Because the Board finds the video near-identical to content that had already been fact-checked, it should have been prioritized for fact-checking. Additionally, Meta should give fact-checkers better tools to address misleading content.

About the Case

In March 2025, former Philippine President Rodrigo Duterte was arrested and transferred to the International Criminal Court in the Netherlands to face charges of alleged crimes against humanity committed during his term in office from 2016 to 2022. A few days after the arrest, a Facebook user reshared a manipulated video that had been posted by another user. The reshared video contains footage from a protest in Serbia unrelated to Duterte’s arrest, with captions and audio added to make it appear to be a pro-Duterte demonstration taking place in the Netherlands.

The video had a text overlay that said, “Netherland.” In the added audio, people repeatedly chant “Duterte” while the song “Bayan Ko” plays in Tagalog. “Bayan Ko” was popular during anti-martial law protests in the Philippines in the 1980s.

The original post, which was shared hundreds of times and received about 100,000 views, was flagged by Meta’s automated systems as possible misinformation. Meta included the content in the online queue for fact-checking. Separately, Meta temporarily lowered the visibility of the post in non-US users’ Facebook feeds. Various similar videos went viral, and several were rated false by Meta’s fact-checking partners in the Philippines. However, due to the high volume of posts in the queue, fact-checkers were not able to review this specific post. Another Facebook user reported the reshared post for spreading misinformation. Meta left the post up, after which the user appealed. A human reviewer considered the appeal and upheld the initial decision. The user then appealed to the Oversight Board.

Key Findings

The Board agrees with Meta that the post should have been left up because it did not include the types of content prohibited by Meta’s Misinformation policy, such as misinformation about voting locations, processes or candidate eligibility. However, the Board notes that, in addition to referring the content for fact-checking and temporarily showing the post lower in users’ feeds, Meta should have applied a “High-Risk” label because the post contained a digitally altered, photorealistic video with a high risk of deceiving the public during a significant public event.

Given the importance of providing transparency around its manipulated media labeling, the Board recommends that Meta describe its different labels and the criteria for applying them. Currently, the most detailed public information on Meta’s manipulated media labeling is found in the Board’s decisions.

Meta should have taken further steps to ensure the post was fact-checked. Although Meta prioritizes similar content for fact-checking during elections, other critical events, such as the high-profile arrest of a former head of state or political crises that are “timely, trending and consequential,” should also qualify for heightened checks. Additionally, after its review, the Board finds that the post should have qualified as near-identical to previously fact-checked content and been labeled as such, while recognizing that Meta faces challenges in making this determination at scale.

The Board notes that manipulated video can be part of concerted misinformation campaigns, in which similar, but not identical, content is posted and reshared with subtle tweaks to evade fact-checking. This makes it imperative that Meta have robust processes to address viral misleading posts, including prioritizing identical or near-identical content for review and applying all of its relevant policies and related tools. Fact-checkers should also be given better tools to rapidly identify viral content that is likely to repeat misleading claims.

The Oversight Board’s Decision

The Board upholds Meta’s decision to leave up the content.

The Board also recommends that Meta:

  • Describe the different informative labels that the company uses for manipulated media, and when it applies them.
  • Build a separate queue within the fact-checking interface that includes content that is similar, but not identical or near-identical, to content that has already been fact-checked in a given market.

Further Information

Public comments for this case are available on the Oversight Board’s website.