Strategic Priorities

Our seven strategic priorities reflect an in-depth analysis of the issues raised in user appeals to the Oversight Board. As these priorities guide the cases we select, we encourage users to take them into account when submitting appeals.

Elections and Civic Space

Social media companies face challenges in consistently applying their policies to political expression in many parts of the world, including during elections and large-scale protests. These challenges are especially complex in countries and regions where there are credible complaints about the absence of effective mechanisms to protect human rights. As a Board, we are exploring Meta’s responsibilities in elections, protests and other key moments for civic participation.

Crisis and Conflict Situations

In times of crisis, such as armed conflict, terrorist attacks and health emergencies, social media can help people exchange critical information, debate important public issues and stay safe. However, it can also create an environment in which misinformation and hatred spread. This is why we are considering Meta’s role in protecting freedom of expression in such circumstances, as well as how prepared the company is for the potential harms its products can contribute to during armed conflicts, civil unrest and other emergencies.

Gender

Women, non-binary and trans people are among those who face obstacles to exercising their right to freedom of expression on social media. We are exploring the obstacles faced by women and LGBTQIA+ people, including gender-based violence and harassment, and the effects of gender-based distinctions in content policy.

Hate Speech Against Marginalized Groups

Hate speech creates an environment of discrimination and hostility towards marginalized groups. It is often context-specific and coded, and its harms accumulate over time. We are asking how Meta should protect members of marginalized groups while also ensuring its enforcement does not incorrectly target those challenging hate. At the same time, we are aware that restrictions on hate speech should not be over-enforced or used to limit the legitimate exercise of freedom of expression, including the expression of unpopular or controversial points of view.

Government Use of Meta’s Platforms

Governments use Facebook and Instagram to convey their policies, and they make requests to Meta to remove content. We are considering how state actors use Meta’s platforms, how they might influence content moderation practices and policies – sometimes in non-transparent ways – and the implications of the state’s involvement in content moderation.

Treating Users Fairly

When people’s content is removed from Facebook, Instagram or Threads, they are not always told which specific rule they have broken. In other instances, users are not treated equally, are denied adequate procedural guarantees, or lack access to remedies for mistakes. We are constantly pushing Meta to treat its users better by providing more specific user notifications, ensuring people can always appeal decisions and being more transparent in areas such as “strikes” and cross-check.

Automated Enforcement of Policies and Curation of Content

While algorithms are crucial to moderating content at scale, there is a lack of transparency and understanding of how Meta’s automated systems work and how they affect the content users see. Our final strategic priority covers how automated enforcement should be designed and reviewed, the accuracy and limitations of automated systems, and the importance of greater transparency in this area.

Working With Stakeholders on Our Strategic Priorities

We want to work with organizations to understand the areas in which Meta most urgently needs to improve, and what types of cases could help address them.

To discuss how your organization can get involved, please contact