Multiple Case Decision

Reports of Israeli Rape Victims

2 cases included in this bundle

Overturned

FB-YCJP0Q9D

Case about dangerous individuals and organizations on Facebook

Platform: Facebook
Topic: Sex and gender equality, Violence, War and conflict
Standard: Dangerous Organizations and Individuals
Location: Israel, Palestinian Territories
Date: Published on April 4, 2024
Overturned

FB-JCO2RJI1

Case about dangerous individuals and organizations on Facebook

Platform: Facebook
Topic: Sex and gender equality, Violence, War and conflict
Standard: Dangerous Organizations and Individuals
Location: Israel, United States, Palestinian Territories
Date: Published on April 4, 2024

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.

Case Summary

A user appealed Meta’s decisions to remove two Facebook posts that describe sexual violence carried out by Hamas militants during the October 7, 2023, terrorist attacks on Israel. After the Board brought the appeals to Meta’s attention, the company reversed its original decisions and restored the posts.

Case Description and Background

In October 2023, a Facebook user uploaded two separate posts, one after the other, containing identical content: a video of a woman describing the rape of Israeli women committed by Hamas during the October 7 terrorist attacks. The caption contains a “trigger warning,” and the speaker in the video warns users about the graphic content. The video goes on to show footage of two different women being kidnapped by Hamas: one clip shows a severely injured woman lying face down in a truck, and another shows an injured woman being dragged from the back of a vehicle. These images were widely shared in the aftermath of the attack. The first post was shared about 4,000 times; the second had fewer than 50 shares.

Both posts were initially removed by Meta for violating the Dangerous Organizations and Individuals Community Standard. Under this policy, the company prohibits third-party imagery depicting the moment of a designated terror attack on identifiable victims under any circumstances, even if shared to condemn or raise awareness of the attack. Additionally, under Meta’s Violence and Incitement Community Standard, the company removes “content that depicts kidnappings or abductions if it is clear that the content is not being shared by a victim or their family as a plea for help, or shared for informational, condemnation or awareness-raising purposes.”

At the outset of the Hamas attack on October 7, Meta began strictly enforcing its Dangerous Organizations and Individuals policy against videos showing moments from individual attacks on visible victims. Meta explained this approach in its October 13 Newsroom post, saying it had done so “in order to prioritize the safety of those kidnapped by Hamas.” The two pieces of content in these cases were therefore removed for violating Meta’s Dangerous Organizations and Individuals policy.

Following this enforcement decision, many news outlets began broadcasting related footage and users started posting similar content to raise awareness of and condemn the attacks. As a result, on or around October 20, Meta updated its policies to allow users to share this footage, but only in the context of raising awareness of or condemning the atrocities, and applied a warning screen to inform users that the footage may be disturbing. Meta published this policy change in a December 5 update to its original October 13 Newsroom post (see Hostages Kidnapped From Israel for additional information and background).

Meta initially removed both pieces of content from Facebook in these two cases. The user appealed Meta’s decisions to the Board. After the Board brought these cases to Meta’s attention, the company determined the posts no longer violated its policies, given the updated allowance, and restored them both.

Board Authority and Scope

The Board has authority to review Meta’s decisions following appeals from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

These two cases highlight challenges in Meta’s ability to enforce its content policies in a high-risk conflict situation that is constantly and rapidly evolving. As the Board found in its expedited decision, Hostages Kidnapped From Israel, Meta’s initial policy prohibiting content depicting hostages protected the dignity of those hostages and aimed to ensure they were not exposed to public curiosity. However, the Board also found that, in exceptional circumstances, when a compelling public interest or the vital interest of hostages requires it, temporary and limited exceptions can be justified. Given the context, restoring this type of content to the platform with a “mark as disturbing” warning screen is consistent with Meta’s content policies, values and human rights responsibilities. It is also consistent with international humanitarian law and the practice of preserving documentation of alleged violations for future accountability, as well as with increasing public awareness. In that case, the Board also noted that Meta took too long to roll out this exception to all users, and that the company’s rapidly changing approach to content moderation during the conflict has been accompanied by an ongoing lack of transparency.

Previously, the Board has issued recommendations relevant to these cases. The Board recommended that Meta announce exceptions to its Community Standards, noting “their duration and notice of their expiration, in order to give people who use its platforms notice of policy changes allowing certain expression” (Iran Protest Slogan, recommendation no. 5). Meta has partially implemented this recommendation, as demonstrated through published information. The Board has also previously recommended that Meta preserve evidence of potential war crimes, crimes against humanity and grave violations of human rights in the interest of future accountability (Sudan Graphic Video, recommendation no. 1; Armenian Prisoners of War Video, recommendation no. 1). Meta has agreed to implement this recommendation and the work is still in progress. The Board emphasizes the need for Meta to act on these recommendations to ensure accurate enforcement of content related to human rights on its platforms.

Decision

The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s corrections of its initial errors once the Board brought the two cases to Meta’s attention.
