A person scrutinizing a sphere she’s holding in her hand, while shapes and clouds float around her.

Oversight Board announces new cases about gender-based violence


April 2023

Today, the Board is announcing two new cases for consideration. As part of this, we are inviting people and organizations to submit public comments.

Case selection

As we cannot hear every appeal, the Board prioritizes cases that have the potential to affect lots of users around the world, are of critical importance to public discourse, or raise important questions about Meta’s policies.

The first case we are announcing today is:

Image of gender-based violence

(2023-006-FB-UA)

User appeal to remove content from Facebook

Submit public comments here.

Please note: the public comment window for this case is open for 14 days, closing at 23:59 your local time on Thursday, May 11.

In May 2021, a Facebook user in Iraq posted a photo with a caption in Arabic. The photo shows a woman with visible marks of a physical attack, including bruises on her face and body. The caption begins by warning women about writing letters to their husbands. The caption states that the woman in the photo wrote a letter to her husband which the husband misunderstood, resulting in the physical attack on the woman. According to different preliminary translations, the post states the husband thought that the woman called him a “donkey” or that she asked him to bring her a “donkey.” It then says that in fact, she was asking him for a “veil.” In Arabic, the words for “donkey” and “veil” are similar. There are several laughing and smiling emojis throughout the post. The caption does not name the woman in the photo, but her face is clearly visible. The post has about 20,000 views and under 1,000 reactions.

In February 2023, a Facebook user reported the content three times for violating the Violence and Incitement Community Standard. The reports were not prioritized for human review and were automatically closed by Meta, leaving the content on the platform. Meta has told the Board that it prioritizes appeals for human review based on certain criteria, including the severity of the violation and virality of the content. Appeals that are not prioritized for review within a certain time frame are automatically closed with no further action. The user who reported the content then appealed Meta’s decision to the Oversight Board. As a result of the Board selecting this case, Meta determined that its previous decision to leave the content on the platform was in error and removed the post.

Meta removed the content from Facebook under its Bullying and Harassment Community Standard. The Bullying and Harassment Community Standard prohibits “content that further degrades individuals who are depicted being physically bullied” or “content that praises, celebrates, or mocks their death or serious physical injury” when it targets private individuals or limited scope public figures. The policy provides the following examples of limited scope public figures: “individuals whose primary fame is limited to their activism, journalism, or those who become famous through involuntary means.” In conducting its review, Meta learned the woman depicted in the photograph is an activist in the region whose image had been shared on social media in the past.

The Board selected this case to explore Meta’s policies and practices in moderating content describing and joking about gender-based violence, and the impact of such content on the rights of users on and off Meta’s platforms. This case falls within the Board’s ‘Gender’ priority, which is one of the Board’s seven strategic priorities.

The Board would appreciate public comments that address:

  • Meta’s policy and enforcement choices about content joking about or mocking gender-based violence.
  • The relationship between Facebook and Instagram content that jokes about or mocks gender-based violence and its effect on people who may be impacted by this content and their ability to use these platforms.
  • The relationship between Facebook and Instagram content that jokes about or mocks gender-based violence and its effect on off-platform gender-based violence.
  • How depictions of gender-based violence may be used to target public figures, human rights defenders, and activists.
  • Insights into the socio-political context in Iraq (and the region), regarding gender-based violence and its depiction on social media.

In its decisions, the Board can issue policy recommendations to Meta. While recommendations are not binding, Meta must respond to them within 60 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

“Violence against women” decision to consider two cases together

The Board has also decided to consider a second new case together with the case that we announced in March 2023. This means the forthcoming “violence against women” decision will consider two cases, the original case announced March 9 (2023-002-IG-UA) and a second case, announced today (2023-005-IG-UA). Adding the second case will allow the Board to explore how Meta’s Hate Speech policy impacts content discussing gender-based violence in more detail and with greater nuance and to provide comprehensive recommendations to Meta.

Public comments remain a key part of the process. To give the panel sufficient time to consider any public comments submitted for this case, we will, exceptionally, open public comments for seven days instead of our usual 14. You can submit comments for the new case here until 23:59 your local time on Thursday, May 4. If you have already submitted comments for the previously announced case, you are welcome to submit any additional thoughts through this channel.

The original case (2023-002-IG-UA) concerned a post including audio of a woman describing her experience in a violent intimate relationship, and a caption saying, "men murder, rape and abuse women mentally and physically – all the time, every day." Meta removed the post under its Hate Speech Community Standard. After the user appealed the decision to the Board, Meta reviewed its decision and told us it believes that removing the post had been an error.

After we announced the case, we found that the same user had appealed another case (2023-005-IG-UA) to the Board. This concerned a post including a video of a woman acknowledging that she is a man-hater. She says that the difference between hating men and misogyny is that hating men is rooted in fear, because men murder and rape worldwide. Meta removed the post for violating its Hate Speech Community Standard. The user appealed to the Board. After the Board selected the case, Meta reviewed its decision again, and told us it believes its original decision to remove the post was correct.

What’s next

Over the next few weeks, Board Members will be deliberating these cases. Once they have reached their final decisions, we will post them on the Oversight Board website.

To receive updates when the Board announces new cases or publishes decisions, sign up here.
