OVERTURNED
2023-053-FB-UA

Breast Self-Exam

A user appealed Meta’s decision to remove a Facebook post that included a video providing instructions on how to perform a breast self-examination.
Policies and topics
Freedom of expression, Health, Sex and gender equality
Adult nudity and sexual activity
Region and countries
Europe
Spain
Platform
Facebook

This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors and inform the public about the impact of the Board’s work. They are approved by a Board Member panel, not the full Board. They do not involve a public comments process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.

Case Summary

A user appealed Meta’s decision to remove a Facebook post that included a video providing instructions on how to perform a breast self-examination. After the Board brought the appeal to Meta’s attention, the company reversed its earlier decision and restored the post.

Case Description and Background

In April 2014, more than nine years before the events of this case, a Facebook user posted a video with a caption. The caption explains that the video provides instructions on how women should perform a breast self-examination each month to check for breast cancer. The animated video depicts a nude female breast and gives information on breast cancer and on when to consult a doctor. The video also specifies that a doctor’s advice should be followed. The post was viewed fewer than 500 times.

Nine years after the post was first shared, Meta removed it from the platform under its Adult Nudity and Sexual Activity policy, which prohibits “imagery of real nude adults” depicting “uncovered female nipples” except, among other reasons, for “breast cancer awareness” purposes. Meta has since acknowledged that the content falls within the policy’s allowance for raising breast cancer awareness and has restored it to Facebook. It is unclear why Meta enforced against the post nine years after it was originally shared.

In her appeal to the Board, the user expressed surprise at the content being taken down after nine years and stated that the purpose of posting the video was to educate women on conducting a breast self-examination, thereby enhancing their likelihood of detecting early-stage symptoms and ultimately saving lives. The user stated that “if they were male breasts, nothing would have happened.”

Board Authority and Scope

The Board has authority to review Meta's decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).

When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.

Case Significance

This case highlights Meta’s inconsistent enforcement of the allowances for medical and health content under the company’s Adult Nudity and Sexual Activity Community Standard. Such inconsistency affects women’s rights to freedom of expression and health. The case underscores the connection between these two rights and the need for effective content moderation that permits content raising awareness about a cause or shared for educational or medical reasons.

In one of its first case decisions, the Board issued recommendations related to Meta’s Adult Nudity and Sexual Activity policy, specifically on this issue. The Board urged Meta to improve the automated detection of images with text overlay to ensure that posts raising awareness of breast cancer symptoms were not wrongly flagged for review (Breast Cancer Symptoms and Nudity, recommendation no. 1). In addition, the Board encouraged Meta to “implement an internal audit procedure to continuously analyze a statistically representative sample of automated content removal decisions to reverse and learn from enforcement mistakes” (Breast Cancer Symptoms and Nudity, recommendation no. 5). Meta reported implementing the first recommendation and published information demonstrating this. For the second, the company described the recommendation as work it already does but did not publish information to demonstrate implementation.

The Board has also emphasized the importance of human review of user appeals submitted to Meta, specifically asking the company to “ensure users can appeal decisions taken by automated systems to a human when their content is found to have violated Facebook’s Community Standard on Adult Nudity and Sexual Activity” (Breast Cancer Symptoms and Nudity, recommendation no. 4). Meta declined to implement this recommendation after assessing its feasibility.

The Board reiterates that full implementation of these recommendations is necessary to reduce the rate at which content raising awareness of, or educating users about, early symptoms of breast cancer is wrongly removed despite the allowance in the Adult Nudity and Sexual Activity Community Standard.

Decision

The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to the company’s attention.
