Overturned
Comment Targeting Wheelchair User
April 30, 2026
A user appealed Meta’s decision to leave up an Instagram comment that mocked an individual for their use of a wheelchair.
Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention and include information about Meta’s acknowledged errors. They are approved by a Board Member panel, rather than the full Board, do not involve public comments and do not have precedential value for the Board. Summary decisions directly bring about changes to Meta’s decisions, providing transparency on these corrections, while identifying where Meta could improve its enforcement.
Summary
A user appealed Meta’s decision to leave up an Instagram comment that mocked an individual for their use of a wheelchair. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and removed the comment.
About the Case
This case concerns a comment written in response to a video posted on Instagram. In September 2025, a user posted a video clip featuring two individuals, one of whom uses a wheelchair while the other can walk, with a text overlay listing four advantages of dating someone who uses a wheelchair. The video appears to be produced in a way that is designed to be humorous and affectionate. The content at issue in this case, a comment responding to the video, states: "And she never run [sic] away [salute emoji]."
In their statement to the Board, the appealing user noted that comments such as the one in this case "stigmatize disabled individuals or people that have disabilities and make them feel less than human." They also noted that comments such as the one at issue imply "sexual assault or rape threats" due to the wheelchair user’s mobility obstacles.
Meta initially left the comment on the platform. However, after the Board brought this case to Meta's attention, the company determined that the comment violated its policies. Under its Bullying and Harassment policy, Meta does not allow "celebration or mocking of death or medical condition." The company noted that the comment's claim that a wheelchair user will "never run away" suggests that the person is unable to leave due to a lack of physical mobility. Meta concluded that the comment implicitly mocks an individual who uses a wheelchair because of their medical condition and, as such, violates the company’s policy. Meta then removed the comment from Instagram.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the user who reported content that was left up (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges it made an error and reverses its decision in a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation process, reduce errors and increase fairness for Facebook, Instagram and Threads users.
Significance of Case
Meta’s failure to remove the comment from Instagram illustrates the company’s challenges in enforcing its Bullying and Harassment Community Standard. It highlights that inaccurate moderation in this area can contribute to the already exclusionary environment and dehumanizing treatment that wheelchair users and people with other physical impairments encounter both offline and online, infringing on their rights and dignity. According to 2022 research by the non-governmental organization Scope UK, nearly three out of four people with disabilities (72%) had experienced negative attitudes or behavior in the previous five years due to their disability. Nearly nine out of ten people with disabilities (87%) who had experienced such negative attitudes or behavior said it had a negative effect on their daily lives. According to the report: "From occasional looks or stares to more severe accusations and verbal or physical abuse. It all adds up, making people with disabilities feel isolated, lonely, and withdrawn from society."
The Board has already emphasized the importance of removing harmful content that mocks an individual based on a medical condition in the Image of Gender-Based Violence case. In that decision, the Board recommended that "to ensure clarity for users, Meta should explain that the term 'medical condition,' as used in the Bullying and Harassment Community Standard, includes 'serious physical injury.' While the internal guidance explains to content moderators that 'medical condition' includes 'serious physical injury,' this explanation is not provided to Meta’s users" (recommendation no. 1). The implementation of this recommendation is in progress. Meta reported that it is "in the process of reviewing its external Bullying and Harassment Community Standard with new policy lines, including clarifying language that 'medical conditions' includes 'serious physical injury' in line with internal guidance" (Meta’s H1 2025 Report [Appendix] on the Oversight Board).
Moreover, in its Gender Identity Debate Videos decision, the Board recommended that Meta "allow users to designate connected accounts, which are able to flag potential Bullying and Harassment violations requiring self-reporting on their behalf" (recommendation no. 3). Implementation of this recommendation is in progress. Meta reported that "allowing others to report on behalf of a person is technically difficult given the way [its] review systems function at scale and may be subject to abuse, but [the company] will explore options and provide an update on this work in a future report" (Meta’s H1 2025 Report [Appendix] on the Oversight Board).
The Board believes that full implementation of the recommendation to clarify the public-facing language of the Bullying and Harassment Community Standard would give users a greater understanding of how their content could violate the policy. In addition, implementation of the recommendation on designating connected accounts would further strengthen the company’s ability to reduce underenforcement of bullying and harassment-related content.
Decision
The Board overturns Meta’s original decision to leave up the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.