OVERTURNED
2021-004-FB-UA

Pro-Navalny protests in Russia

The Oversight Board has overturned Facebook's decision to remove a comment in which a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a 'cowardly bot'.
Policies and topics
Freedom of expression, News events, Politics
Bullying and harassment
Region and countries
Europe
Russia
Platform
Facebook

Case summary

The Oversight Board has overturned Facebook’s decision to remove a comment in which a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a “cowardly bot.” Facebook removed the comment for using the word “cowardly” which was construed as a negative character claim. The Board found that while the removal was in line with the Bullying and Harassment Community Standard, the current Standard was an unnecessary and disproportionate restriction on free expression under international human rights standards. It was also not in line with Facebook’s values.

About the case

On January 24, a user in Russia made a post consisting of several pictures, a video, and text (root post) about the protests in support of opposition leader Alexei Navalny held in Saint Petersburg and across Russia on January 23. Another user (the Protest Critic) responded to the root post and wrote that while they did not know what happened in Saint Petersburg, the protesters in Moscow were all school children, mentally “slow,” and were “shamelessly used.”

Other users then challenged the Protest Critic in subsequent comments to the root post. A user who was at the protest (the Protester) appeared to be the last to respond to the Protest Critic. They claimed to be elderly and to have participated in the protest in Saint Petersburg. The Protester ended the comment by calling the Protest Critic a “cowardly bot.”

The Protest Critic then reported the Protester’s comment to Facebook for bullying and harassment. Facebook determined that the term “cowardly” was a negative character claim against a “private adult” and, since the “target” of the attack reported the content, Facebook removed it. The Protester appealed against this decision to Facebook. Facebook determined that the comment violated the Bullying and Harassment policy, under which a private individual can have Facebook remove posts containing a negative claim about their character.

Key findings

This case highlights the tension between policies protecting people against bullying and harassment and the need to protect freedom of expression. This is especially relevant in the context of political protest in a country where there are credible complaints about the absence of effective mechanisms to protect human rights.

The Board found that, while Facebook’s removal of the content may have been consistent with a strict application of the Community Standards, the Community Standards failed to consider the wider context and disproportionately restricted freedom of expression.

The Community Standard on Bullying and Harassment states that Facebook removes negative character claims about a private individual when the target reports the content. The Board does not challenge Facebook’s conclusion that the Protest Critic is a private individual and that the term “cowardly” was a negative character claim.

However, the Community Standard did not require Facebook to consider the political context, the public character, or the heated tone of the conversation. Accordingly, Facebook did not consider the Protester’s intent to refute false claims about the protests or attempt to balance that concern against the reported negative character claim.

The decision to remove this content failed to balance Facebook’s values of “Dignity” and “Safety” against “Voice.” Political speech is central to the value of “Voice” and should only be limited where there are clear “Safety” or “Dignity” concerns.

"Voice” is also particularly important in countries where freedom of expression is routinely suppressed, as in Russia. In this case, the Board found that Facebook was aware of the wider context of pro-Navalny protests in Russia, and heightened caution should have led to a more careful assessment of content.

The Board found that Facebook’s Community Standard on Bullying and Harassment has a legitimate aim in protecting the rights of others. However, in this case, combining the distinct concepts of bullying and harassment into a single set of rules, which were not clearly defined, led to the unnecessary removal of legitimate speech.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content, requiring the post to be restored.

In a policy advisory statement, the Board recommends that, to comply with international human rights standards, Facebook should amend and redraft its Bullying and Harassment Community Standard to:

· Explain the relationship between its Bullying and Harassment policy rationale and the “Do nots” as well as the other rules restricting content that follow it.

· Differentiate between bullying and harassment and provide definitions that distinguish the two acts. The Community Standard should also clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.

· Clearly define its approach to different target user categories and provide illustrative examples of each target category (i.e. who qualifies as a public figure). Format the Community Standard on Bullying and Harassment by user categories currently listed in the policy.

· Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.

· When assessing content including a ‘negative character claim’ against a private adult, Facebook should amend the Community Standard to require an assessment of the social and political context of the content. Facebook should reconsider the enforcement of this rule in political or public debates where the removal of the content would stifle debate.

· Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.

*Case summaries provide an overview of the case and do not have precedential value.

Full case decision

1. Decision summary

The Oversight Board has overturned Facebook’s decision to remove a comment where a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a “cowardly bot.” Facebook clarified that the comment was removed for using the word “cowardly” which was construed as a negative character claim. The Board found that while the removal was in line with the Bullying and Harassment Community Standard, this Standard was an unnecessary and disproportionate restriction on freedom of expression under international human rights standards. It was also not in accordance with Facebook’s values.

2. Case description

On January 24, a user in Russia made a post consisting of several pictures, a video, and text (root post) about the protests in support of opposition leader Alexei Navalny held in Saint Petersburg and across Russia on January 23. Another user (the Protest Critic) responded to the root post and wrote that while they did not know what happened in Saint Petersburg, the protesters in Moscow were all school children, mentally “slow,” and were “shamelessly used.” The Protest Critic added that the protesters were not the voice of the people but a “theatre show.”

Other users then challenged the Protest Critic in subsequent comments to the root post. These other users defended the protesters and stated that the Protest Critic was spreading nonsense and misunderstood the Navalny movement. The Protest Critic responded in several comments, repeatedly dismissing these challenges and referring to Navalny as a “pocket clown” and “rotten,” claiming that people supporting him have no self-respect. They also called people who brought their grandparents to the protests “morons.”

A user who was at the protest (the Protester) appeared to be the last to respond to the Protest Critic. They self-identified as elderly and as having participated in the protest in Saint Petersburg. They noted that there were many people at the protests, including disabled and elderly people, and that they were proud to see young people protesting. They said that the Protest Critic was deeply mistaken in thinking that young protesters had been manipulated. The Protester ended the comment by calling the Protest Critic a “cowardly bot.”

The Protest Critic then reported the Protester’s comment to Facebook for bullying and harassment. Facebook determined that the term “cowardly” was a negative character claim against a “private adult” (i.e. not a public figure) and, since the “target” of the attack reported the content, Facebook removed it. Facebook did not find the term “bot” to be a negative character claim. The Protester appealed against this decision to Facebook. Facebook reviewed the appeal and determined that the comment violated the Bullying and Harassment policy. The content was reviewed within four minutes of the Protester requesting an appeal, which according to Facebook “falls within the standard timeframe” for reviewing content on appeal.

3. Authority and scope

The Board has the power to review Facebook’s decision following an appeal from the user whose post was removed (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1). The Board may uphold or reverse that decision (Charter Article 3, Section 5).

The Board’s decisions are binding and may include policy advisory statements with recommendations. These recommendations are nonbinding, but Facebook must respond to them (Charter Article 3, Section 4).

The Board is an independent grievance mechanism to address disputes in a transparent and principled manner.

4. Relevant standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

The Community Standard on Bullying and Harassment is broken into two parts. It includes a policy rationale followed by a list of “Do nots,” which are specific rules around what content should not be posted and when it may be removed.

The policy rationale begins by stating that bullying and harassment can take many forms, including threatening messages and unwanted malicious contact. It then declares that Facebook does not tolerate this kind of behavior “because it prevents people from feeling safe and respected.” The rationale also explains that Facebook approaches bullying and harassment of public and private individuals differently to allow open discussion of current events. The policy rationale adds that to protect private individuals, Facebook removes any content “that is meant to degrade or shame” them.

One of the “Do not” rules that follows the rationale declares that it is not permitted to “target private adults (who must self-report)” with “negative character or ability claims, except in the context of criminal allegations against adults.” The Community Standards do not define the meaning of a “negative character claim.” Further, Facebook explained to the Board that it “does not maintain an exhaustive list of which terms qualify as negative character claims,” although “several of Facebook’s regionally focused operational teams maintain dynamic, non-exhaustive lists of terms in the relevant market language in order to provide guidance for terms which may be difficult to classify, such as terms that are new or used in a variety of ways.”

Facebook also has longer documents detailing the Internal Implementation Standards on Bullying and Harassment and how to apply the policy. These non-public guidelines define key terms and offer guidance and illustrative examples to moderators on what content may be removed under the policy. In an excerpt provided to the Board, a “negative character claim” was defined as “specific terms or descriptions that attack an individual's mental or moral qualities. This encompasses: disposition, temperament, personality, mentality, etc. Claims solely about an individual's actions are not encompassed, nor are criminal allegations.”
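
To make the mechanics of this rule concrete, the following is a minimal illustrative sketch, in Python, of the removal logic as the Community Standard and the internal guidelines quoted above describe it. All names are hypothetical, and Facebook’s actual enforcement tooling is not public; this is a reading aid, not a description of any real system.

```python
from dataclasses import dataclass

@dataclass
class Report:
    reported_text: str             # the comment under review
    reporter_is_target: bool       # the person named in the content filed the report
    target_is_private_adult: bool  # i.e. not a public figure
    is_criminal_allegation: bool   # criminal allegations are exempt from the rule

# Hypothetical, deliberately tiny term list. Per Facebook, no exhaustive
# global list exists; regional teams keep dynamic, non-exhaustive lists.
NEGATIVE_CHARACTER_TERMS = {"cowardly"}

def contains_negative_character_claim(text: str) -> bool:
    """Crude stand-in for human review: does the text contain a term
    attacking an individual's mental or moral qualities?"""
    return any(term in text.lower() for term in NEGATIVE_CHARACTER_TERMS)

def should_remove(report: Report) -> bool:
    """The rule as written: remove a negative character claim against a
    private adult only when the target self-reports; no wider context
    (tone, political setting, the target's own language) is consulted."""
    return (
        report.target_is_private_adult
        and report.reporter_is_target
        and not report.is_criminal_allegation
        and contains_negative_character_claim(report.reported_text)
    )

# On the facts of this case, the rule yields removal:
print(should_remove(Report("You are a cowardly bot", True, True, False)))  # True
```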

II. Facebook’s values

Facebook’s values are outlined in the introduction to the Community Standards.

The value of “Voice” is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Facebook limits “Voice” in service of four values, and two are relevant here:

“Safety”: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.

“Dignity”: We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

III. Human rights standards

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In March 2021, Facebook announced its Corporate Human Rights Policy, where it recommitted to respecting rights in accordance with the UNGPs. The Board's analysis in this case was informed by the following human rights standards:

  • Freedom of expression: Article 19, International Covenant on Civil and Political Rights (ICCPR); Human Rights Committee, General Comment No. 34 (2011); Special Rapporteur on freedom of opinion and expression, A/74/486 (2019); Special Rapporteur on violence against women, A/HRC/38/47 (2018); Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda, FOM.GAL/3/17 (2017);
  • The right of peaceful assembly: Article 21, ICCPR; Special Rapporteur on the rights to freedom of peaceful assembly and of association, A/HRC/20/27 (2012);
  • The right to health: Article 12, International Covenant on Economic, Social and Cultural Rights (ICESCR).

5. User statement

In their appeal to the Board, the Protester explained that their comment was not offensive and simply refuted the false claims of the Protest Critic. The Protester claimed that the Protest Critic sought to prevent people from seeing contradictory opinions and was “imposing their opinions in many publications,” which made them think they were a paid bot with no actual first-hand experience of the protests.

6. Explanation of Facebook’s decision

Facebook stated that it removed the Protester’s comment for violating its Bullying and Harassment policy in line with its values of “Dignity” and “Safety.” It noted that the Community Standards require the removal of content that targets private adults with negative character claims whenever it is reported by the targeted person. A user is deemed to be targeted when they are referenced by name in the content. In this case, Facebook stated that “cowardly” is “easily discerned to be a negative character claim” targeting the Protest Critic.

Facebook explained that it removes any content meant to degrade or shame private individuals if the targets report it themselves. The requirement for the targeted person to report the content was put in place to help Facebook better understand when people feel bullied or harassed.

Facebook justified prohibiting attacks on a user’s character on the ground that such attacks prevent people from feeling safe and respected on the platform, which decreases their likelihood of engaging in debate or discussion. Citing an article from the anti-bullying charity Ditch the Label, Facebook reiterated that bullying “undermines the right to freedom of expression … and creates an environment in which the self-expression of others—often marginalized groups—is suppressed.” Facebook also cited other research suggesting that users who have experienced harassment are likely to self-censor.

Facebook stated that by limiting content removals to cases where the target is a private adult who reports that they find the content harmful, the company ensures everyone’s “Voice” is heard. According to Facebook, this is reinforced by an appeals system that lets users request a review of content removed for violating the Bullying and Harassment policy to help prevent enforcement errors.

Facebook also stated that its decision was consistent with international human rights standards. Facebook stated that (a) its policy was publicly accessible, (b) the decision to remove the content was legitimate to protect the freedom of expression of others, and (c) the removal of the content was necessary to eliminate unwanted harassment. In Facebook’s view, its decision was proportionate as lesser measures would still expose the Protest Critic to harassment and potentially impact others who may see it.

7. Third-party submissions

The Board received 23 public comments on this case. Eight came from Europe, 13 from the US and Canada, one from Asia, and one from Latin America and the Caribbean. The submissions covered issues including whether Facebook is contributing to silencing dissent in Russia and thereby supporting Russian President Vladimir Putin, the context of state-sponsored domestic social-media manipulation in Russia, and whether the content was serious enough to constitute bullying or harassment.

A range of organizations and individuals submitted comments, including activists, journalists, anti-bullying groups, and members of the Russian opposition.

8. Oversight Board analysis

This case highlights the tension between policies protecting people against bullying and harassment and the need to protect freedom of expression. This is especially relevant in the context of a political protest in a country where there are credible complaints about the absence of effective and independent mechanisms for the protection of human rights.

The Board seeks to evaluate whether this content should be restored to Facebook through three lenses: Facebook's Community Standards; the company's values; and its human rights responsibilities.

8.1 Compliance with Community Standards

The Board found that Facebook’s removal of the content is consistent with the “Do not” rule prohibiting targeting private individuals with negative character claims. The Community Standard on Bullying and Harassment states that Facebook removes negative character claims aimed at a private individual when the target reports the content. If the same content is reported by a person who is not targeted, it will not be removed.

To the Board, the term “cowardly” does not appear serious or harmful in the context of this case, given the tone of the discussion. Nevertheless, the Board does not challenge Facebook’s conclusion that the Protest Critic is a private individual and that the term “cowardly” may be construed as a negative character claim.

The Board recognizes the importance of the Bullying and Harassment policy. According to the National Anti-Bullying Research and Resource Centre, bullying and harassment are two distinct concepts. While there is no widely agreed definition of either bullying or harassment, common elements of academic definitions include willful and repeated attacks as well as power imbalances. These elements are not reflected in Facebook’s Community Standards.

Kate Klonick wrote that, given the lack of a clear definition and the highly context-specific and subjective nature of harm, Facebook claimed that it had two choices: to keep potentially harmful content up in the interests of free expression, or to err on the side of removing all potentially harmful speech (even if some of that content turned out to be benign). Encouraged by some advocacy groups and media debate on cyberbullying, Facebook chose the latter option. The requirement that private individuals report content that targets them appears to be an attempt to limit the amount of benign content removed.

The Board appreciates the difficulties involved in setting policy in this area as well as the importance of protecting users’ safety. This particularly applies to women and vulnerable groups, who are at higher risk of online bullying and harassment. However, the Board found that, in this case, the negative character claim was used in a heated exchange on a matter of public interest and was no worse than the language used by the Protest Critic. The Protest Critic had voluntarily engaged in a debate on that matter. This case illustrates that Facebook’s blunt and decontextualized approach can disproportionately restrict freedom of expression. Enforcing the Community Standard appears limited to determining whether a single term is a negative character claim and whether it has been reported by the user targeted by the claim. There is no assessment of the wider context or conversation.

In this case, Facebook did not consider the Protest Critic’s derogatory language about pro-Navalny protesters. Facebook also did not consider the Protester’s intent to refute false claims about the protests spread by the Protest Critic, nor did it make any attempt to balance that concern against the reported bullying. Instead, the company stated that this balancing exercise is undertaken when the Community Standards are drafted, so that moderation decisions are made solely on the individual piece of content that has been reported. Ultimately, decisions to remove content seem to be made based on a single word if that word is deemed to be a negative character claim, regardless of the context of any exchange the content may be part of.
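
Purely as an illustration of the Board’s critique (not a description of any actual Facebook system), the missing step can be restated as an extra input: the self-report rule consults only the flagged term and the reporter’s identity, whereas the contextual assessment the Board calls for would also weigh the surrounding exchange. A hypothetical sketch:

```python
from dataclasses import dataclass

@dataclass
class Context:
    political_public_debate: bool     # matter of public interest?
    target_engaged_voluntarily: bool  # did the target join the debate willingly?
    mutually_heated_exchange: bool    # claim no worse than the target's own language?
    repeated_attacks: bool            # hallmark of bullying in academic definitions
    power_imbalance: bool             # likewise

def should_remove_in_context(rule_violation: bool, ctx: Context) -> bool:
    """Hypothetical context-aware variant reflecting the Board's analysis:
    a single negative character claim made in a heated, voluntary political
    exchange, absent repetition or power imbalance, stays up."""
    if not rule_violation:
        return False
    if ctx.repeated_attacks or ctx.power_imbalance:
        return True   # indicators of genuine bullying/harassment
    if (ctx.political_public_debate
            and ctx.target_engaged_voluntarily
            and ctx.mutually_heated_exchange):
        return False  # removal would disproportionately restrict political speech
    return True

# Applied to the facts of this case, the comment would not be removed:
print(should_remove_in_context(True, Context(True, True, True, False, False)))  # False
```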

8.2 Compliance with Facebook’s values

The Board found that Facebook’s decision to remove this content did not comply with Facebook’s values. Further, the company failed to balance the values of “Dignity” and “Safety” against “Voice.”

The Board found that political speech is central to the value of “Voice.” As such, it should only be limited where there are clear concerns around “Safety” or “Dignity.” In the context of an online political discussion, a certain level of disagreement should be expected. The Protest Critic vigorously exercised their voice, but was challenged and called a “cowardly bot.” While the Protester’s use of “cowardly” and “bot” could be seen as a negative character claim, it formed part of a broader exchange on an issue of public interest.

In relation to political matters “Voice” is particularly important in countries where freedom of expression is routinely suppressed. The Board considered well-documented instances of pro-government actors in Russia engaging in anti-opposition expression in online spaces. While there is no evidence of government involvement in this case, the general efforts of the Russian authorities to manipulate online discourse and drown out opposition voices provides crucial context for assessing Facebook’s decision to limit “Voice” in this instance.

The values of “Safety” and “Dignity” protect users from feeling threatened, silenced or excluded. Bullying and harassment are always highly context-specific and can have severe impacts on the safety and dignity of those targeted. The Board notes that “the consequences of and harm caused by different manifestations of online violence are specifically gendered, given that women and girls suffer from particular stigma in the context of structural inequality, discrimination and patriarchy” (A/HRC/38/47, para. 25).

As the Protest Critic was not invited to provide a statement, the impact of this post on them is unknown. However, analysis of the comment thread shows the user actively engaged in a contentious political discussion and felt safe to attack and insult Navalny, his supporters, and the January 23 protesters. The term “cowardly bot” may generally be considered insulting and may offend the “Dignity” of the user who reported the content. However, the Board finds that the risk of likely harm to the Protest Critic was minor considering the tone of the overall exchange.

8.3 Compliance with Facebook’s human rights responsibilities

The Board found that the removal of the Protester’s content under the Bullying and Harassment Community Standard was not consistent with Facebook’s human rights responsibilities.

Freedom of expression (Article 19 ICCPR)

Article 19, para. 2, of the ICCPR provides broad protection for expression of “all kinds,” including political discourse; the “free communication of information and ideas about public and political issues between citizens … is essential” (General Comment No. 34, para. 13). The UN Human Rights Committee has made clear that the protection of Article 19 extends to expression that may be considered “deeply offensive” (General Comment No. 34, paras. 11, 12).

While the right to freedom of expression is fundamental, it is not absolute. It may be restricted, but restrictions should meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). Facebook should seek to align its policies on bullying and harassment with these principles (UN Special Rapporteur on freedom of expression, report A/74/486, para. 58(b)).

I. Legality

The principle of legality under international human rights law requires rules used to limit expression to be clear and accessible (General Comment No. 34, para. 25). People need to understand what is and what is not allowed. Additionally, precision in rulemaking ensures expression is not limited selectively. Here, however, the Board found Facebook’s Bullying and Harassment Community Standard to be unclear and overly complicated.

Overall, the Community Standard is organized in a way that makes it difficult to understand and follow. The policy rationale offers a broad understanding of what the Standard aims to achieve, which includes making users feel safe as well as preventing speech that degrades or shames. The rationale is then followed by a number of “Do nots” and additional rules under two yellow warning signs. These rules list prohibited content, when and how Facebook takes action, and the degrees of protection enjoyed by distinct user groups. The Community Standards do not make clear whether the aims of the rationale serve simply as guidance for the specific rules that follow, or whether they must be interpreted conjunctively with the rules. Furthermore, the information is organized in a seemingly random order. For example, rules applicable to private individuals precede, follow and are sometimes mixed in with rules related to public figures.

The Community Standard fails to differentiate between bullying and harassment. As previously noted, experts on the subject agree that these are distinct behaviors. Further, as argued by civil society organization Article 19, the Community Standard falls below international standards on freedom of expression due to its lack of guidance on how bullying and harassment differ from threats or otherwise offensive speech. The Board finds that combining the distinct concepts of bullying and harassment into a single definition and corresponding set of rules has resulted in the removal of legitimate speech.

Furthermore, while the Bullying and Harassment policy applies differently to various categories of individuals and groups, it fails to define these categories. Other key terms, such as “negative character claim,” also lack clear definitions. Accordingly, the Board concludes that the Community Standard failed the test of legality.

II. Legitimate aim

Under international human rights law, any measure restricting expression must be for a purpose listed in Article 19, para. 3, of the ICCPR. Legitimate aims include the protection of the rights or reputations of others, as well as the protection of national security, public order, or public health or morals (General Comment No. 34, para. 28).

The Board accepts that the Bullying and Harassment Community Standard aims to protect the rights of others. Users’ freedom of expression may be undermined if they are forced off the platform due to bullying and harassment. The policy also seeks to deter behavior that can cause significant emotional distress and psychological harm, implicating users’ right to health. However, the Board notes that any restrictions on freedom of expression must be drafted with care, and a rule’s mere connection to a legitimate aim is not enough to satisfy human rights standards on freedom of expression (General Comment No. 34, paras. 28, 30, 31, 32).

III. Necessity and proportionality

Any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment No. 34, para. 34).

Facebook properly distinguishes between public and private individuals, but it does not recognize the context in which discussions may take place. For instance, in some circumstances private persons engaged in public debate over matters of public concern may open themselves up to criticism pertaining to their statements. The company narrowed the potential reach of its rule on negative character claims against private adults by requiring the targeted user to report content. The Board further notes that in addition to reporting abusive content, Facebook allows users to block or mute each other. This is a useful, albeit limited, tool against abuse. Because these options offer less restrictive means of limiting expression, the content removal in this case was disproportionate.

Context is key for assessing necessity and proportionality. The UN Special Rapporteur on freedom of expression has stated in relation to hate speech that the “evaluation of context may lead to a decision to make an exception in some instances, when the content must be protected as, for example, political speech” (A/74/486, at para. 47(d)). This approach may be extended to assessments of bullying and harassment. In this case, Facebook should have considered the environment for freedom of expression in Russia generally, and specifically government campaigns of disinformation against opponents and their supporters, including in the context of the January protests. The Protest Critic’s engagement with the Protester in this case repeated the false claim that Navalny protesters were manipulated children. The accusation of “cowardly bot” in the context of a heated discussion on these issues was unlikely to cause harm, in particular given the equally hostile allegations and accusations from the Protest Critic.

Facebook notified the Board that in January 2021 it determined that potential mass nationwide protests in support of Navalny constituted a high-risk event and asked its moderators to flag trends and content where it was unclear if Community Standards had been violated. In March 2021, Facebook reported that it removed 530 Instagram accounts involved in coordinated inauthentic activities targeting pro-Navalny Russian users. Facebook was thus aware of the wider context of the content in this case, and heightened caution should have led to a more careful assessment of content related to the protests.

Additionally, the removed content appears to have lacked elements that often constitute bullying and harassment, such as repeat attacks or an indication of a power imbalance. While calling someone cowardly can be a negative character claim, the content was a culmination of a heated political exchange on current events in Russia. Considering the factors above, the Board concludes that Facebook’s decision to remove the content under its Bullying and Harassment Community Standard was unnecessary and disproportionate.

9. Oversight Board decision

The Oversight Board overturns Facebook’s decision to remove the content, requiring the post to be restored.

10. Policy advisory statement

To comply with international human rights standards, Facebook should amend and redraft its Bullying and Harassment Community Standard to:

1. Explain the relationship between the policy rationale and the “Do nots” as well as the other rules restricting content that follow it.

2. Differentiate between bullying and harassment and provide definitions that distinguish the two acts. Further, the Community Standard should clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.

3. Clearly define its approach to different target user categories and provide illustrative examples of each target category (i.e. who qualifies as a public figure). Format the Community Standard on Bullying and Harassment by user categories currently listed in the policy.

4. Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.

5. When assessing content including a ‘negative character claim’ against a private adult, Facebook should amend the Community Standard to require an assessment of the social and political context of the content. Facebook should reconsider the enforcement of this rule in political or public debates where the removal of the content would stifle debate.

6. Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.
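
Purely as a hypothetical illustration of this final recommendation (no Facebook notification API or message format is public), such a notice would need to carry at least the offending word or phrase so that the user can make an informed edit and repost:

```python
from dataclasses import dataclass

@dataclass
class RemovalNotice:
    """Hypothetical payload for recommendation 6: tell the user exactly
    which word or phrase triggered removal of a larger post."""
    post_id: str
    policy: str
    offending_span: str  # the single word/phrase found violating

def notice_message(n: RemovalNotice) -> str:
    # A user-facing message enabling an informed edit-and-repost.
    return (
        f"Your comment was removed under the {n.policy} policy because it "
        f"contains the phrase \"{n.offending_span}\". You may repost the "
        f"comment without this phrase."
    )

print(notice_message(RemovalNotice("c123", "Bullying and Harassment", "cowardly")))
```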

*Procedural note:

The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context.

Policies and topics
Freedom of expression, News events, Politics
Bullying and harassment
Region and countries
Europe
Russia
Platform
Facebook
Policies and topics
Freedom of expression, News events, Politics
Bullying and harassment
Region and countries
Europe
Russia
Platform
Facebook

Case SummaryCase Summary

The Oversight Board has overturned Facebook’s decision to remove a comment in which a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a “cowardly bot.” Facebook removed the comment for using the word “cowardly” which was construed as a negative character claim. The Board found that while the removal was in line with the Bullying and Harassment Community Standard, the current Standard was an unnecessary and disproportionate restriction on free expression under international human rights standards. It was also not in line with Facebook’s values.

About the case

On January 24, a user in Russia made a post consisting of several pictures, a video, and text (root post) about the protests in support of opposition leader Alexei Navalny held in Saint Petersburg and across Russia on January 23. Another user (the Protest Critic) responded to the root post and wrote that while they did not know what happened in Saint Petersburg, the protesters in Moscow were all school children, mentally “slow,” and were “shamelessly used.”

Other users then challenged the Protest Critic in subsequent comments to the root post. A user who was at the protest (the Protester) appeared to be the last to respond to the Protest Critic. They claimed to be elderly and to have participated in the protest in Saint Petersburg. The Protester ended the comment by calling the Protest Critic a “cowardly bot.”

The Protest Critic then reported the Protester’s comment to Facebook for bullying and harassment. Facebook determined that the term “cowardly” was a negative character claim against a “private adult” and, since the “target” of the attack reported the content, Facebook removed it. The Protester appealed against this decision to Facebook. Facebook determined that the comment violated the Bullying and Harassment policy, under which a private individual can get Facebook to take down posts containing a negative comment on their character.

Key findings

This case highlights the tension between policies protecting people against bullying and harassment and the need to protect freedom of expression. This is especially relevant in the context of political protest in a country where there are credible complaints about the absence of effective mechanisms to protect human rights.

The Board found that, while Facebook’s removal of the content may have been consistent with a strict application of the Community Standards, the Community Standards fail to consider the wider context and disproportionately restricted freedom of expression.

The Community Standard on Bullying and Harassment states that Facebook removes negative character claims about a private individual when the target reports the content. The Board does not challenge Facebook’s conclusion that the Protest Critic is a private individual and that the term “cowardly” was a negative character claim.

However, the Community Standard did not require Facebook to consider the political context, the public character, or the heated tone of the conversation. Accordingly, Facebook did not consider the Protester’s intent to refute false claims about the protests or attempt to balance that concern against the reported negative character claim.

The decision to remove this content failed to balance Facebook’s values of “Dignity” and “Safety” against “Voice.” Political speech is central to the value of “Voice” and should only be limited where there are clear “Safety” or “Dignity” concerns.

"Voice” is also particularly important in countries where freedom of expression is routinely suppressed, as in Russia. In this case, the Board found that Facebook was aware of the wider context of pro-Navalny protests in Russia, and heightened caution should have led to a more careful assessment of content.

The Board found that Facebook’s Community Standard on Bullying and Harassment has a legitimate aim in protecting the rights of others. However, in this case, combining the distinct concepts of bullying and harassment into a single set of rules, which were not clearly defined, led to the unnecessary removal of legitimate speech.

The Oversight Board’s decision

The Oversight Board overturns Facebook’s decision to remove the content, requiring the post to be restored.

In a policy advisory statement, the Board recommends that, to comply with international human rights standards, Facebook should amend and redraft its Bullying and Harassment Community Standard to:

· Explain the relationship between its Bullying and Harassment policy rationale and the “Do nots” as well as the other rules restricting content that follow it.

· Differentiate between bullying and harassment and provide definitions that distinguish the two acts. The Community Standard should also clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.

· Clearly define its approach to different target user categories and provide illustrative examples of each target category (i.e. who qualifies as a public figure). Format the Community Standard on Bullying and Harassment by user categories currently listed in the policy.

· Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.

· When assessing content including a ‘negative character claim’ against a private adult, Facebook should amend the Community Standard to require an assessment of the social and political context of the content. Facebook should reconsider the enforcement of this rule in political or public debates where the removal of the content would stifle debate.

· Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.

*Case summaries provide an overview of the case and do not have precedential value.

Full case decision Full case decision

1. Decision summary

The Oversight Board has overturned Facebook’s decision to remove a comment where a supporter of imprisoned Russian opposition leader Alexei Navalny called another user a “cowardly bot.” Facebook clarified that the comment was removed for using the word “cowardly” which was construed as a negative character claim. The Board found that while the removal was in line with the Bullying and Harassment Community Standard, this Standard was an unnecessary and disproportionate restriction on freedom of expression under international human rights standards. It was also not in accordance with Facebook’s values.

2. Case Description

On January 24, a user in Russia made a post consisting of several pictures, a video, and text (root post) about the protests in support of opposition leader Alexei Navalny held in Saint Petersburg and across Russia on January 23. Another user (the Protest Critic) responded to the root post and wrote that while they did not know what happened in Saint Petersburg, the protesters in Moscow were all school children, mentally “slow,” and were “shamelessly used.” The Protest Critic added that the protesters were not the voice of the people but a “theatre show.”

Other users then challenged the Protest Critic in subsequent comments to the root post. These other users defended the protesters and stated that the Protest Critic was spreading nonsense and misunderstood the Navalny movement. The Protest Critic responded in several comments, repeatedly dismissing these challenges and referring to Navalny as a “pocket clown” and “rotten,” claiming that people supporting him have no self-respect. They also called people who brought their grandparents to the protests “morons.”

A user who was at the protest (the Protester) appeared to be the last to respond to the Protest Critic. They self-identified as elderly and as having participated in the protest in Saint Petersburg. They noted that there were many people at the protests, including disabled and elderly people, and that they were proud to see young people protesting. They said that the Protest Critic was deeply mistaken in thinking that young protesters had been manipulated. The Protester ended the comment by calling the Protest Critic a “cowardly bot.”

The Protest Critic then reported the Protester’s comment to Facebook for bullying and harassment. Facebook determined that the term “cowardly” was a negative character claim against a “private adult” (i.e. not a public figure) and, since the “target” of the attack reported the content, Facebook removed it. Facebook did not find the term “bot” to be a negative character claim. The Protester appealed against this decision to Facebook. Facebook reviewed the appeal and determined that the comment violated the Bullying and Harassment policy. The content was reviewed within four minutes of the Protester requesting an appeal, which according to Facebook “falls within the standard timeframe” for reviewing content on appeal.

3. Authority and scope

The Board has the power to review Facebook’s decision following an appeal from the user whose post was removed (Charter Article 2, Section 1; Bylaws Article 2, Section 2.1). The Board may uphold or reverse that decision (Charter Article 3, Section 5).

The Board’s decisions are binding and may include policy advisory statements with recommendations. These recommendations are nonbinding, but Facebook must respond to them (Charter Article 3, Section 4).

The Board is an independent grievance mechanism to address disputes in a transparent and principled manner.

4. Relevant standards

The Oversight Board considered the following standards in its decision:

I. Facebook’s Community Standards

The Community Standard on Bullying and Harassment is broken into two parts. It includes a policy rationale followed by a list of “Do nots,” which are specific rules around what content should not be posted and when it may be removed.

The policy rationale begins by stating that bullying and harassment can take many forms, including threatening messages and unwanted malicious contact. It then declares that Facebook does not tolerate this kind of behavior because it prevents people from feeling safe and respected.” The rationale also explains that Facebook approaches bullying and harassment of public and private individuals differently to allow open discussion of current events. The policy rationale adds that to protect private individuals, Facebook removes any content “that is meant to degrade or shame” them.

One of the “Do not” rules that follows the rationale declares that it is not permitted to “target private adults (who must self-report)” with “negative character or ability claims, except in the context of criminal allegations against adults.” The Community Standards do not define the meaning of a “negative character claim.” Further, Facebook explained to the Board that it “does not maintain an exhaustive list of which terms qualify as negative character claims.” Although “several of Facebook’s regionally focused operational teams maintain dynamic, non-exhaustive lists of terms in the relevant market language in order to provide guidance for terms which may be difficult to classify, such as terms that are new or used in a variety of ways.”

Facebook also has longer documents detailing the Internal Implementation Standards on Bullying and Harassment and how to apply the policy. These non-public guidelines define key terms and offer guidance and illustrative examples to moderators on what content may be removed under the policy. In an excerpt provided to the Board, a “negative character claim” was defined as “specific terms or descriptions that attack an individual's mental or moral qualities. This encompasses: disposition, temperament, personality, mentality, etc. Claims solely about an individual's actions are not encompassed, nor are criminal allegations.”

II. Facebook’s values

Facebook’s values are outlined in the introduction to the Community Standards.

The value of “Voice” is described as “paramount”:

The goal of our Community Standards has always been to create a place for expression and give people a voice. […] We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable.

Facebook limits “Voice” in service of four values, and two are relevant here:

“Safety”: We are committed to making Facebook a safe place. Expression that threatens people has the potential to intimidate, exclude or silence others and isn't allowed on Facebook.

“Dignity” : We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.

III. Human rights standards

The UN Guiding Principles on Business and Human Rights (UNGPs), endorsed by the UN Human Rights Council in 2011, establish a voluntary framework for the human rights responsibilities of private businesses. In March 2021, Facebook announced its Corporate Human Rights Policy, where it recommitted to respecting rights in accordance with the UNGPs. The Board's analysis in this case was informed by the following human rights standards:

  • Freedom of expression: Article 19, International Covenant on Civil and Political Rights ( ICCPR); Human Rights Committee, General Comment No. 34 (2011); Special Rapporteur on freedom of opinion and expression, A/74/486 (2019); Special Rapporteur on violence against women, A/HRC/38/47 (2018); Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda, FOM.GAL/3/17 (2017);
  • The right of peaceful assembly: Article 21, ICCPR; Special Rapporteur on the rights to freedom of peaceful assembly and of association A/HRC/20/27 (2012);
  • The right to health: Article 12, International Covenant on Economic, Social and Cultural Rights (ICESCR).

5. User statement

In their appeal to the Board, the Protester explained that their comment was not offensive and simply refuted the false claims of the Protest Critic. The Protester claimed that the Protest Critic sought to prevent people from seeing contradictory opinions and was “imposing their opinions in many publications,” which made them think they were a paid bot with no actual first-hand experience of the protests.

6. Explanation of Facebook’s decision

Facebook stated that it removed the Protester’s comment for violating its Bullying and Harassment policy in line with its values of “Dignity” and “Safety.” It noted that the Community Standards require the removal of content that targets private adults with negative character claims whenever it is reported by the targeted person. A user is deemed to be targeted when they are referenced by name in the content. In this case Facebook stated that “cowardly” is “easily discerned to be a negative character claim” targeting the Protest Critic.

Facebook explained that they remove any content meant to degrade or shame private individuals if the targets report it themselves. The requirement for the targeted person to report the content was put in place to help Facebook better understand when people feel bullied or harassed.

Facebook justified prohibiting attacks on a user’s character on the ground that such attacks prevent people from feeling safe and respected on the platform, which decreases their likelihood of engaging in debate or discussion. Citing an article from an anti-bullying charity Ditch the Label, Facebook reiterated that bullying “undermines the right to freedom of expression . . . and creates an environment in which the self-expression of others—often marginalized groups—is suppressed.” Facebook also cited other research suggesting that users who have experienced harassment are likely to self-censor.

Facebook stated that by limiting content removals to cases where the target is a private adult who reports that they find the content harmful, the company ensures everyone’s “Voice” is heard. According to Facebook, this is reinforced by an appeals system that lets users request a review of content removed for violating the Bullying and Harassment policy to help prevent enforcement errors.

Facebook also stated that its decision was consistent with international human rights standards. Facebook stated that (a) its policy was publicly accessible, (b) the decision to remove the content was legitimate to protect the freedom of expression of others, and (c) the removal of the content was necessary to eliminate unwanted harassment. In Facebook’s view, its decision was proportionate as lesser measures would still expose the Protest Critic to harassment and potentially impact others who may see it.

7. Third-party submissions

The Board received 23 public comments on this case. Eight came from Europe, 13 from the US and Canada, one from Asia, and one from Latin America and the Caribbean. The submissions covered issues including, whether Facebook is contributing to silencing dissent in Russia and thereby supporting Russian President Vladimir Putin, the context of state-sponsored domestic social-media manipulation in Russia, and whether the content was serious enough to constitute bullying or harassment.

A range of organizations and individuals submitted comments, including activists, journalists, anti-bullying groups, and members of the Russian opposition.

To read public comments submitted for this case click here.

8. Oversight Board analysis

This case highlights the tension between policies protecting people against bullying and harassment and the need to protect freedom of expression. This is especially relevant in the context of a political protest in a country where there are credible complaints about the absence of effective and independent mechanisms for the protection of human rights.

The Board seeks to evaluate whether this content should be restored to Facebook through three lenses: Facebook's Community Standards; the company's values; and its human rights responsibilities.

8.1 Compliance with Community Standards

The Board found that Facebook’s removal of the content is consistent with the “Do not” rule prohibiting targeting private individuals with negative character claims. The Community Standard on Bullying and Harassment states that Facebook removes negative character claims aimed at a private individual when the target reports the content. If the same content is reported by a person who is not targeted, it will not be removed.

To the Board the term “cowardly” does not appear to be a serious or harmful term in the context of this case because of the tone of the discussion. Nevertheless, the Board does not challenge Facebook’s conclusion that the Protest Critic is a private individual and that the term “cowardly” may be construed as a negative character claim.

The Board recognizes the importance of the Bullying and Harassment policy. According to the National Anti-Bullying Research and Resource Centre, bullying and harassment are two distinct concepts. While there is no widely agreed definition of either bullying or harassment, common elements of academic definitions include willful and repeated attacks as well as power imbalances. These elements are not reflected in Facebook’s Community Standards.

Kate Klonick wrote that given the lack of a clear definition and the highly context-specific and subjective nature of harm, Facebook claimed that it had two choices: to keep-up up potentially harmful content in the interests of free expression, or to err on the side of removing all potentially harmful speech (even if some of that content turned out to be benign). Encouraged by some advocacy groups and media debate on cyber bullying, Facebook chose the latter option. The requirement that private individuals report content that targets them appears to be an attempt to limit the amount of benign content removed.

The Board appreciates the difficulties involved in setting policy in this area as well as the importance of protecting users’ safety. This particularly applies to women and vulnerable groups who are at higher risks of online bullying and harassment. However, the Board found that, in this case, the negative character claim was used in a heightened exchange on a matter of public issue and was no worse than the language used by the Protest Critic. The Protest Critic had voluntarily engaged in a debate on a matter of public interest. This case illustrates that Facebook’s blunt and decontextualized approach can disproportionately restrict freedom of expression. Enforcing the Community Standard appears limited to determining whether a single term is a negative character claim and whether it has been reported by the user targeted by the claim. There is no assessment of the wider context or conversation.

In this case, Facebook did not consider the Protest Critic’s derogatory language about pro-Navalny protesters. Facebook also did not consider the Protester’s intent to refute false claims about the protests spread by the Protest Critic nor made any attempt to balance that concern against the reported bullying. Instead, the company stated that this balancing exercise is undertaken when the Community Standards are drafted so that moderation decisions are made solely on the individual piece of content that has been reported. Ultimately, decisions to remove content seem to be made based on a single word if that word is deemed to be a negative character claim, regardless of the context of any exchange the content may be part of.

8.2 Compliance with Facebook’s values

The Board found that Facebook’s decision to remove this content did not comply with Facebook’s values. Further, the company failed to balance the values of “Dignity” and “Safety” against “Voice.”

The Board found that political speech is central to the value of “Voice.” As such, it should only be limited where there are clear concerns around “Safety” or “Dignity.” In the context of an online political discussion, a certain level of disagreement should be expected. The Protest Critic vigorously exercised their voice, but was challenged and called a “cowardly bot.” While the Protester’s use of “cowardly” and “bot” could be seen as a negative character claim, it formed part of a broader exchange on an issue of public interest.

In relation to political matters, “Voice” is particularly important in countries where freedom of expression is routinely suppressed. The Board considered well-documented instances of pro-government actors in Russia engaging in anti-opposition expression in online spaces. While there is no evidence of government involvement in this case, the general efforts of the Russian authorities to manipulate online discourse and drown out opposition voices provide crucial context for assessing Facebook’s decision to limit “Voice” in this instance.

The values of “Safety” and “Dignity” protect users from feeling threatened, silenced or excluded. Bullying and harassment are always highly context-specific and can have severe impacts on the safety and dignity of those targeted. The Board notes that “the consequences of and harm caused by different manifestations of online violence are specifically gendered, given that women and girls suffer from particular stigma in the context of structural inequality, discrimination and patriarchy” (A/HRC/38/47, para. 25).

As the Protest Critic was not invited to provide a statement, the impact of this post on them is unknown. However, analysis of the comment thread shows the user actively engaged in a contentious political discussion and felt safe to attack and insult Navalny, his supporters, and the January 23 protesters. The term “cowardly bot” may be generally considered insulting and may offend the “Dignity” of the user who reported the content. However, the Board finds that the likely harm to the Protest Critic was minor, considering the tone of the overall exchange.

8.3 Compliance with Facebook’s human rights responsibilities

The Board found that the removal of the Protester’s content under the Bullying and Harassment Community Standard was not consistent with Facebook’s human rights responsibilities.

Freedom of expression (Article 19 ICCPR)

Article 19, para. 2, of the ICCPR provides broad protection for expression of “all kinds,” including political discourse; the “free communication of information and ideas about public and political issues between citizens … is essential” (General Comment No. 34, para. 13). The UN Human Rights Committee has made clear that the protection of Article 19 extends to expression that may be considered “deeply offensive” (General Comment No. 34, paras. 11, 12).

While the right to freedom of expression is fundamental, it is not absolute. It may be restricted, but restrictions should meet the requirements of legality, legitimate aim, and necessity and proportionality (Article 19, para. 3, ICCPR). Facebook should seek to align its policies on bullying and harassment with these principles (UN Special Rapporteur on freedom of expression, report A/74/486, para. 58(b)).

I. Legality

The principle of legality under international human rights law requires rules used to limit expression to be clear and accessible (General Comment No. 34, para. 25). People need to understand what is and what is not allowed. Additionally, precision in rulemaking ensures expression is not limited selectively. Here, however, the Board found Facebook’s Bullying and Harassment Community Standard to be unclear and overly complicated.

Overall, the Community Standard is organized in a way that makes it difficult to understand and follow. The policy rationale offers a broad understanding of what the Standard aims to achieve, which includes making users feel safe as well as preventing speech that degrades or shames. The rationale is then followed by a number of “Do nots” and additional rules under two yellow warning signs. These rules list prohibited content, when and how Facebook takes action, and the degrees of protection enjoyed by distinct user groups. The Community Standard does not make clear whether the aims of the rationale serve simply as guidance for the specific rules that follow, or whether they must be interpreted conjunctively with those rules. Furthermore, the information is organized in a seemingly random order. For example, rules applicable to private individuals precede, follow, and are sometimes mixed in with rules related to public figures.

The Community Standard fails to differentiate between bullying and harassment. As previously noted, experts on the subject agree that these are distinct behaviors. Further, as argued by civil society organization Article 19, the Community Standard falls below international standards on freedom of expression due to its lack of guidance on how bullying and harassment differ from threats or otherwise offensive speech. The Board finds that combining the distinct concepts of bullying and harassment into a single definition and corresponding set of rules has resulted in the removal of legitimate speech.

Furthermore, while the Bullying and Harassment policy applies differently to various categories of individuals and groups, it fails to define these categories. Other key terms, such as “negative character claim,” also lack clear definitions. Accordingly, the Board concludes that the Community Standard failed the test of legality.

II. Legitimate aim

Under international human rights law, any measure restricting expression must be for a purpose listed in Article 19, para. 3, of the ICCPR. Legitimate aims include the protection of the rights or reputations of others, as well as the protection of national security, public order, or public health or morals (General Comment No. 34, para. 28).

The Board accepts that the Bullying and Harassment Community Standard aims to protect the rights of others. Users’ freedom of expression may be undermined if they are forced off the platform by bullying and harassment. The policy also seeks to deter behavior that can cause significant emotional distress and psychological harm, implicating users’ right to health. However, the Board notes that any restriction on freedom of expression must be drafted with care, and a rule’s connection to a legitimate aim is not by itself enough to satisfy human rights standards on freedom of expression (General Comment No. 34, paras. 28, 30, 31, 32).

III. Necessity and proportionality

Any restrictions on freedom of expression “must be appropriate to achieve their protective function; they must be the least intrusive instrument amongst those which might achieve their protective function; they must be proportionate to the interest to be protected” (General Comment No. 34, para. 34).

Facebook properly distinguishes between public and private individuals, but it does not recognize the context in which discussions take place. For instance, in some circumstances private persons engaged in public debate over matters of public concern may open themselves up to criticism pertaining to their statements. The company narrowed the potential reach of its rule on negative character claims against private adults by requiring the targeted user to report content. The Board further notes that, in addition to reporting abusive content, Facebook allows users to block or mute each other: a useful, albeit limited, tool against abuse. Because these less restrictive means of limiting expression were available, the removal of the content in this case was disproportionate.

Context is key for assessing necessity and proportionality. The UN Special Rapporteur on freedom of expression has stated in relation to hate speech that the “evaluation of context may lead to a decision to make an exception in some instances, when the content must be protected as, for example, political speech” (A/74/486, at para. 47(d)). This approach may be extended to assessments of bullying and harassment. In this case, Facebook should have considered the environment for freedom of expression in Russia generally, and specifically government campaigns of disinformation against opponents and their supporters, including in the context of the January protests. The Protest Critic’s engagement with the Protester in this case repeated the false claim that Navalny protesters were manipulated children. The accusation of “cowardly bot” in the context of a heated discussion on these issues was unlikely to cause harm, in particular given the equally hostile allegations and accusations from the Protest Critic.

Facebook notified the Board that in January 2021 it determined that potential mass nationwide protests in support of Navalny constituted a high-risk event and asked its moderators to flag trends and content where it was unclear if Community Standards had been violated. In March 2021, Facebook reported that it removed 530 Instagram accounts involved in coordinated inauthentic activities targeting pro-Navalny Russian users. Facebook was thus aware of the wider context of the content in this case, and heightened caution should have led to a more careful assessment of content related to the protests.

Additionally, the removed content appears to have lacked elements that often constitute bullying and harassment, such as repeated attacks or an indication of a power imbalance. While calling someone cowardly can be a negative character claim, the content here was the culmination of a heated political exchange on current events in Russia. Considering the factors above, the Board concludes that Facebook’s decision to remove the content under its Bullying and Harassment Community Standard was unnecessary and disproportionate.

9. Oversight Board decision

The Oversight Board overturns Facebook’s decision to remove the content, requiring the post to be restored.

10. Policy advisory statement

To comply with international human rights standards, Facebook should amend and redraft its Bullying and Harassment Community Standard to:

1. Explain the relationship between the policy rationale and the “Do nots” as well as the other rules restricting content that follow it.

2. Differentiate between bullying and harassment and provide definitions that distinguish the two acts. Further, the Community Standard should clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.

3. Clearly define its approach to different target user categories and provide illustrative examples of each category (e.g., who qualifies as a public figure). Organize the Community Standard on Bullying and Harassment by the user categories currently listed in the policy.

4. Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.

5. Require an assessment of the social and political context when reviewing content that includes a “negative character claim” against a private adult. Facebook should reconsider enforcement of this rule in political or public debates where the removal of the content would stifle debate.

6. Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.

*Procedural note:

The Oversight Board's decisions are prepared by panels of five Members and approved by a majority of the Board. Board decisions do not necessarily represent the personal views of all Members.

For this case decision, independent research was commissioned on behalf of the Board. An independent research institute headquartered at the University of Gothenburg and drawing on a team of over 50 social scientists on six continents, as well as more than 3,200 country experts from around the world, provided expertise on socio-political and cultural context.