Multiple Case Decision

Educational posts about ovulation

2 cases included in this bundle

Overturned

FB-YZ2ZBZWN

Case about adult nudity and sexual activity on Facebook

Platform: Facebook
Topic: Health, Sex and gender equality
Standard: Adult nudity and sexual activity
Location: United States, Pakistan
Date: Published on November 16, 2023
Overturned

IG-F5NPUOXQ

Case about adult nudity and sexual activity on Instagram

Platform: Instagram
Topic: Health, Sex and gender equality
Standard: Adult nudity and sexual activity
Location: Argentina
Date: Published on November 16, 2023

This is a summary decision. Summary decisions examine cases where Meta reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not consider public comments, and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas of potential improvement in its policy enforcement.

Case summary

In this summary decision, the Board considers two educational posts about ovulation together. The Board believes that Meta’s original decisions to remove each post made it more difficult for people to access a highly stigmatized area of health information for women. After the Board brought these two appeals to Meta’s attention, the company reversed its earlier decisions and restored both posts.

Case description and background

In the first case, on March 15, 2023, a Facebook user based in the United States commented on a post in a Facebook group. The comment was written in English and included a photo of four different types of cervical mucus and their corresponding fertility levels, with a description of each overlaid on the photo. The comment responded to another user’s post asking about PCOS (Polycystic Ovary Syndrome), fertility issues and vaginal discharge. The content had no views and no shares, and was reported once by Meta’s automated systems. The group describes its purpose as providing women in Pakistan who suffer from “invisible conditions” related to reproductive health, such as “endometriosis, adenomyosis, PCOS and other menstrual issues,” with a safe space to discuss the challenges they face and to support one another.

In the second case, on March 7, 2023, an Instagram user posted a video showing a person’s hand over a sink with vaginal discharge on the fingers. The caption underneath the video is written in Spanish, and its headline reads, "Ovulation - How to Recognize It?" The rest of the caption describes in detail how cervical mucus becomes clearer during ovulation and at what point in the menstrual cycle someone can expect to be ovulating. It also describes other physiological changes that accompany ovulation, such as increased libido, raised body temperature and difficulty sleeping. The account’s description says it is dedicated to vaginal/vulvar health and period/menstruation education. The content had more than 25,000 views and no shares, and was reported once by Meta’s automated systems.

Meta initially removed both pieces of content under its Adult Nudity and Sexual Activity policy, which prohibits “imagery of sexual activity” except “in cases of medical or health context.”

After the Board brought these two cases to Meta’s attention, the company determined that neither piece of content violated the Adult Nudity and Sexual Activity Community Standard: both fall within the policy’s allowance for imagery showing the by-products of sexual activity (which may include vaginal secretions) in a medical or health context. Acknowledging that the removals were incorrect, Meta restored the content to Facebook and Instagram respectively.

Board authority and scope

The Board has authority to review Meta's decisions following an appeal from the user whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

Where Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for people who use Facebook and Instagram.

Case significance

These cases highlight the difficulties of enforcing the allowances for medical and health content set out in Meta’s Adult Nudity and Sexual Activity policy. As the user in the first case wrote in their appeal to the Board, understanding the appearance and texture of cervical mucus helps women track their cycles for ovulation and fertility. The user also noted that not all women have the means or resources to learn this information from a physician, purchase ovulation kits or have bloodwork done to track ovulation. Meta’s initial decision to remove this content made it more difficult for people to access what is already a highly stigmatized area of health information for women.

The Board has previously issued recommendations related to the Adult Nudity and Sexual Activity policy, both to protect content that educates users and raises awareness about medical and health issues, and to improve the enforcement of allowances set out in the company’s Community Standards. Specifically, the Board has urged Meta to improve the automated detection of images with text overlay, to ensure that posts raising awareness of breast cancer symptoms are not wrongly flagged for review (“Breast cancer symptoms and nudity,” recommendation no. 1), and to ensure that appeals based on policy exceptions are prioritized for human review (“Two buttons meme,” recommendation no. 5). Meta has completed work on the first recommendation and is currently assessing the feasibility of the second. For the first, Meta deployed a new image-based health-content classifier and enhanced an existing text-overlay classifier to further improve Instagram’s identification of content posted in a breast cancer context. Over a 30-day period in 2023, these enhancements contributed to an additional 3,500 pieces of content being sent for human review that would previously have been removed automatically.

The full implementation of both recommendations will help reduce the wrongful removal of content posted under an allowance in the Community Standards, such as content raising awareness of or educating users about women’s reproductive health.

Decision

The Board overturns Meta’s original decisions to remove the two pieces of content. The Board acknowledges Meta’s correction of its initial errors once the Board brought these cases to the company’s attention.
