
Oversight Board upholds Facebook decision: Case 2020-003-FB-UA


January 2021

The Oversight Board has upheld Facebook’s decision to remove a post containing a demeaning slur which violated Facebook’s Community Standard on Hate Speech.

About the case

In November 2020, a user posted content which included historical photos described as showing churches in Baku, Azerbaijan. The accompanying text in Russian claimed that Armenians built Baku and that this heritage, including the churches, had been destroyed. The user used the term “тазики” (“taziks”) to describe Azerbaijanis, who the user claimed are nomads and have no history compared to Armenians.

The user included hashtags in the post calling for an end to Azerbaijani aggression and vandalism. Another hashtag called for the recognition of Artsakh, the Armenian name for the Nagorno-Karabakh region, which is at the center of the conflict between Armenia and Azerbaijan. The post received more than 45,000 views and was posted during the recent armed conflict between the two countries.

Key findings

Facebook removed the post for violating its Community Standard on Hate Speech, claiming the post used a slur to describe a group of people based on a protected characteristic (national origin).

The post used the term “тазики” (“taziks”) to describe Azerbaijanis. While this can be translated literally from Russian as “wash bowl,” it can also be understood as wordplay on the Russian word “азики” (“aziks”), a derogatory term for Azerbaijanis which features on Facebook’s internal list of slur terms. Independent linguistic analysis commissioned on behalf of the Board confirms Facebook’s understanding of “тазики” as a dehumanizing slur attacking national origin.

The context in which the term was used makes clear it was meant to dehumanize its target. As such, the Board believes that the post violated Facebook’s Community Standards.

The Board also found that Facebook’s decision to remove the content complied with the company’s values. While Facebook takes “Voice” as a paramount value, the company’s values also include “Safety” and “Dignity.”

From September to November 2020, fighting over the disputed territory of Nagorno-Karabakh resulted in the deaths of several thousand people, with the content in question being posted shortly before a ceasefire.

In light of the dehumanizing nature of the slur and the danger that such slurs can escalate into physical violence, Facebook was permitted in this instance to prioritize people’s “Safety” and “Dignity” over the user’s “Voice.”

A majority of the Board found that the removal of this post was consistent with international human rights standards on limiting freedom of expression.

The Board found it apparent to users that the term “тазики,” when used to describe Azerbaijanis, would be understood as a dehumanizing label for a group defined by its nationality, and that Facebook had a legitimate aim in removing the post.

The majority of the Board also viewed Facebook’s removal of the post as necessary and proportionate to protect the rights of others. Dehumanizing slurs can create an environment of discrimination and violence which can silence other users. During an armed conflict, the risks to people’s rights to equality, security of person and, potentially, life are especially pronounced.

While the majority of the Board found that these risks made Facebook’s response proportionate, a minority believed that Facebook’s action did not meet international standards, was not proportionate, and that Facebook should have considered enforcement measures other than removal.

The Oversight Board’s decision

The Board upholds Facebook’s decision to remove the content.

In a policy advisory statement, the Board recommends that Facebook:

  • Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing. In this case, the user was informed that the post violated Facebook’s Community Standard on Hate Speech but was not told that this was because the post contained a slur attacking national origin. Facebook’s lack of transparency left its decision open to the mistaken belief that the company removed the content because the user expressed a view it disagreed with.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.
