Overturned
Azov Removal
December 8, 2023
This is a summary decision. Summary decisions examine cases in which Meta has reversed its original decision on a piece of content after the Board brought it to the company’s attention. These decisions include information about Meta’s acknowledged errors. They are approved by a Board Member panel, not the full Board. They do not involve a public comment process and do not have precedential value for the Board. Summary decisions provide transparency on Meta’s corrections and highlight areas in which the company could improve its policy enforcement.
Case Summary
A user appealed Meta’s decision to remove an Instagram post asking, “where is Azov?” in Ukrainian. The post's caption calls for soldiers of the Azov Regiment in Russian captivity to be returned. After the Board brought the appeal to Meta’s attention, the company reversed its original decision and restored the post.
Case Description and Background
In December 2022, an Instagram user created a post with an image of the Azov Regiment symbol. Overlaying the symbol was text in Ukrainian asking, “where is Azov?” The caption stated that more than 700 Azov soldiers remain in Russian captivity, with their conditions unknown, and called for their return: “we must scream until all the Azovs are back from captivity!”
The user appealed the removal of the post, emphasizing the importance of sharing information during times of war. The user also highlighted that the content did not violate Meta’s policies, since Meta allows content commenting on the Azov Regiment. The post received nearly 800 views and was detected by Meta’s automated systems.
Meta originally removed the post from Instagram under its Dangerous Organizations and Individuals (DOI) policy, which prohibits content that “praises,” “substantively supports” or “represents” individuals and organizations that Meta designates as dangerous. However, Meta allows “discussions about the human rights of designated individuals or members of designated dangerous entities, unless the content includes other praise, substantive support or representation of designated entities or other policy violations, such as incitement to violence.”
Meta told the Board that it removed the Azov Regiment from its Dangerous Organizations and Individuals list in January 2023. A Washington Post article states that Meta now draws a distinction between the Azov Regiment, which it views as under formal control of the Ukrainian government, and other elements of the broader Azov movement, some of which the company considers far-right nationalists and still designates as dangerous.
After the Board brought this case to Meta’s attention, the company determined that its removal was incorrect and restored the content to Instagram. The company acknowledged that the Azov Regiment is no longer designated as a dangerous organization. Additionally, Meta recognized that regardless of the Azov Regiment’s designation, this post falls under the exception that allows references to dangerous individuals and organizations when discussing the human rights of individuals and members of designated entities.
Board Authority and Scope
The Board has authority to review Meta’s decision following an appeal from the person whose content was removed (Charter Article 2, Section 1; Bylaws Article 3, Section 1).
When Meta acknowledges that it made an error and reverses its decision on a case under consideration for Board review, the Board may select that case for a summary decision (Bylaws Article 2, Section 2.1.3). The Board reviews the original decision to increase understanding of the content moderation processes involved, reduce errors and increase fairness for Facebook and Instagram users.
Case Significance
This case highlights shortcomings in how Meta updates its Dangerous Organizations and Individuals list and enforces the policy, shortcomings that raise heightened concerns during times of war. The case also illustrates the systemic challenges of enforcing exceptions to Meta’s policy on Dangerous Organizations and Individuals.
Previously, the Board issued a recommendation stating that Meta’s Dangerous Organizations and Individuals policy should allow users to discuss alleged human rights abuses of members of dangerous organizations (Öcalan’s Isolation decision, recommendation no. 5), which Meta committed to implement. Furthermore, the Azov Regiment was removed from Meta’s Dangerous Organizations and Individuals list in January 2023. The Board has issued a recommendation stating that, when any new policy is adopted, internal guidance and training should be provided to content moderators (Öcalan’s Isolation decision, recommendation no. 8). The Board has also issued recommendations on the enforcement accuracy of Meta’s policies by calling for further transparency regarding enforcement error rates on the “praise” and “support” of dangerous individuals and organizations (Öcalan’s Isolation decision, recommendation no. 12), and for the implementation of an internal audit procedure to learn from past automated enforcement mistakes (Breast Cancer Symptoms and Nudity decision, recommendation no. 5). Fully implementing these recommendations could help Meta decrease the number of similar content moderation errors.
Decision
The Board overturns Meta’s original decision to remove the content. The Board acknowledges Meta’s correction of its initial error once the Board brought the case to Meta’s attention.