
Oversight Board announces two cases: Altered Video of President Biden and Weapons Post Linked to Sudan’s Conflict


October 2023

Today, the Board is announcing two new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect many users around the world, are of critical importance to public discourse, or raise important questions about Meta's policies.

The cases that we are announcing today are:

Altered Video of President Biden

2023-029-FB-UA

User appeal to remove content from Facebook

Submit public comments, which can be provided anonymously, here.

On October 29, 2022, President Biden voted early in person during the 2022 midterm elections in the United States, accompanied by his adult granddaughter, a first-time voter. After they voted, they exchanged “I Voted” stickers. President Biden placed a sticker on his granddaughter, above her chest, at her direction, and kissed her on the cheek. This moment was captured on video.

In May 2023, a Facebook user posted a seven-second altered version of that clip. The footage has been altered so that it loops, repeating the moment when President Biden’s hand makes contact with his granddaughter’s chest. The altered video plays to a short excerpt of the song “Simon Says” by Pharoahe Monch, which has the lyric “Girls rub on your titties.” The caption that accompanies the video states that President Biden is “a sick pedophile” for touching his granddaughter’s breast in the way he does. It also questions the people who voted for him, saying they are “mentally unwell.”

A user reported the content to Meta, but the company did not remove the post. The reporting user appealed, and a human reviewer upheld the decision not to remove the content. As of early September 2023, the post had received fewer than 30 views and had not been shared. The same user then appealed to the Oversight Board, stating the video was manipulated.

After the Board selected this case, Meta confirmed its decision to leave the content on the platform was correct. According to Meta’s assessment, the Manipulated Media Community Standard did not warrant removal of the content because it only applies to videos generated by artificial intelligence or to those in which a subject is shown saying words they did not say. Meta decided that the Misinformation or Bullying and Harassment Community Standards did not apply in this case either. Additionally, the content was not reviewed by independent fact-checkers as part of Meta's fact-checking program, although Meta did acknowledge that available news coverage indicates the video has been altered.

The Board selected this case to assess whether Meta’s policies adequately cover altered videos that could mislead people into believing politicians have taken actions, outside of speech, that they have not. This case falls within the Board’s strategic priorities of “elections and civic space” and “automated enforcement of policies and curation of content.”

The Board would appreciate public comments that address:

  • Research into online trends of using altered or manipulated video content to influence the perception of political figures, especially in the United States.
  • The suitability of Meta’s misinformation policies, including on manipulated media, to respond to present and future challenges in this area, particularly in the context of elections.
  • Meta’s human rights responsibilities when it comes to video content that has been altered to create a misleading impression of a public figure, and how they should be understood with developments in generative artificial intelligence in mind.
  • Challenges to and best practices in authenticating video content at scale, including by using automation.
  • Research into the efficacy of alternative responses to political disinformation or misinformation beyond content removal, such as fact-checking programs or labelling (also known as “inform treatments”). Additionally, research on avoiding bias in such responses.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Weapons Post Linked to Sudan’s Conflict

2023-028-FB-UA

User appeal to restore content from Facebook

Submit public comments, which can be provided anonymously, here.

To read this announcement in Arabic, click here.


In June 2023, a Facebook user posted an illustration of a bullet, with notes in Arabic identifying its different components. The caption for the post, also in Arabic, provides instructions on how to empty a gun’s cartridge of its bullet and use the components to create a Molotov cocktail-type device – a simple incendiary, typically in a bottle, that can be easily made. There is an additional note on how to throw the device safely, and the caption also states, “victory for the Sudanese people” and for the “armed people forces.” The content had only a few views before being removed by Meta.

The content refers to the user’s homeland of Sudan. In April 2023, fighting broke out in the country’s capital between the Sudanese Armed Forces and the paramilitary group, the Rapid Support Forces (RSF). Other groups have since joined the armed conflict, which has left thousands dead and forced more than four million people to flee.

On the day the content was posted, a “hostile speech classifier” enforcing three of Facebook’s Community Standards – Hate Speech, Violence and Incitement, and Bullying and Harassment – determined that it violated one of them. Meta removed the post for violating Facebook’s Violence and Incitement Community Standard. Following the removal, Meta applied a standard strike and a three-day feature limit to the content creator’s profile, preventing them from interacting with groups and from creating or joining any Messenger rooms. The user immediately appealed Meta’s decision, which led to a human reviewer assessing the post. Meta upheld the removal, but this time for a violation of the Restricted Goods and Services policy. The user then appealed to the Board.

After the Board brought the case to Meta’s attention, the company determined that its original decision to remove the content under the Violence and Incitement Community Standard was correct. Under this policy, Meta removes content that includes instructions on “how to make or use explosives” or “how to make or use weapons if there is evidence of a goal to seriously injure or kill people.”

The Board selected this case to assess Meta’s policies on weapons-related content and the company’s enforcement practices in the context of conflicts. This case falls within one of the Board’s seven strategic priorities: crisis and conflict situations.

The Board would appreciate public comments that address:

  • Insights into Sudan’s socio-political context, including among communities of the Sudanese diaspora, the country’s ongoing conflict, and the potential for content similar to the post in this case to cause offline harm.
  • How international humanitarian law (also known as the law of armed conflict) should inform Meta’s responsibilities when it comes to moderating weapons-related content in the context of armed conflicts.
  • Meta’s enforcement of its content policies for Arabic-language expression in relation to the situation in Sudan, as well as the company’s use of automation to enforce its rules.
  • The impact of content moderation on users’ abilities to engage in online discussion of armed conflict in Sudan and elsewhere, and Meta’s human rights responsibilities in this context.

As part of its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

Public comments

If you or your organization feel that you can contribute valuable perspectives that can help with reaching a decision on the cases announced today, you can submit your contributions using the links above. Please note that public comments can be provided anonymously. The public comment window is open for 14 days, closing at 23:59 your local time on Tuesday, October 24.

What’s next

Over the next few weeks, Board members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website. To receive updates when the Board announces new cases or publishes decisions, sign up here.
