Oversight Board Upholds Facebook Decision in Depiction of Zwarte Piet Case

The Oversight Board has upheld Facebook’s decision to remove content that violated the express prohibition on posting caricatures of Black people in the form of blackface, contained in its Hate Speech Community Standard.

About the case

On December 5, 2020, a Facebook user in the Netherlands shared a post including text in Dutch and a 17-second-long video on their timeline. The video showed a young child meeting three adults, one dressed to portray “Sinterklaas” and two portraying “Zwarte Piet,” also referred to as “Black Pete.”

The two adults portraying Zwarte Piet have their faces painted black and wear Afro wigs under hats and colorful renaissance-style clothes. All the people in the video appear to be white, including those with their faces painted black. In the video, festive music plays and one Zwarte Piet says to the child, “[l]ook here, and I found your hat. Do you want to put it on? You’ll be looking like an actual Pete!”

Facebook removed the post for violating its Hate Speech Community Standard.

Key findings

While Zwarte Piet represents a cultural tradition shared by many Dutch people without apparent racist intent, it involves the use of blackface, which is widely recognized as a harmful racial stereotype.

Since August 2020, Facebook has explicitly prohibited caricatures of Black people in the form of blackface as part of its Hate Speech Community Standard. As such, the Board found that Facebook made it sufficiently clear to users that content featuring blackface would be removed unless shared to condemn the practice or raise awareness.

A majority of the Board saw sufficient evidence of harm to justify removing the content. They argued the content included caricatures that are inextricably linked to negative and racist stereotypes, and are considered by parts of Dutch society to sustain systemic racism in the Netherlands. They took note of documented cases of Black people experiencing racial discrimination and violence in the Netherlands linked to Zwarte Piet. These included reports that during the Sinterklaas festival Black children felt scared and unsafe in their homes and were afraid to go to school.

A majority found that allowing such posts to accumulate on Facebook would help create a discriminatory environment for Black people that would be degrading and harassing. They believed that the impacts of blackface justified Facebook’s policy and that removing the content was consistent with the company’s human rights responsibilities.

A minority of the Board, however, saw insufficient evidence to directly link this piece of content to the harm supposedly being reduced by removing it. They noted that Facebook’s value of “Voice” specifically protects disagreeable content and that, while blackface is offensive, depictions on Facebook will not always cause harm to others. They also argued that restricting expression based on cumulative harm can be hard to distinguish from attempts to protect people from subjective feelings of offense.

The Board found that removing content without providing an adequate explanation could be perceived as unfair by the user. In this regard, it noted that the user was not told that their content was specifically removed under Facebook’s blackface policy.

The Oversight Board’s decision

The Oversight Board upholds Facebook’s decision to remove the content.

In a policy advisory statement, the Board recommends that Facebook:

  • Link the rule in the Hate Speech Community Standard prohibiting blackface to its reasoning for the rule, including the harms the company seeks to prevent.
  • Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing, in line with the Board’s recommendation in case 2020-003-FB-UA. Where Facebook removes content for violating its rule on blackface, any notice to users should refer to this specific rule, and link to resources that explain the harm this rule seeks to prevent. Facebook should also provide a detailed update on its “feasibility assessment” of the Board’s prior recommendations on this topic.

For further information:

To read the full case decision, click here.

To read a synopsis of public comments for this case, click here.
