Strategic Priorities

As a Board, we select (and periodically reevaluate and revise) strategic priorities that reflect the most impactful issues emerging from the appeals made to us. We develop deeper expertise on these priorities, which informs our work as we push Meta to treat users fairly and respect freedom of expression. These priorities guide the cases we select.

Automation and AI

With content increasingly generated by AI and many of Meta’s content moderation decisions made by automated systems, the Board has consistently called for transparency around these processes. We have also pushed Meta to address inaccurate automated enforcement, highlighting the impact of such errors on different groups. Transparency and accuracy are even more pressing now that online platforms are deploying AI-powered tools that can generate potentially harmful content at massive scale and make enforcement decisions that can unduly limit users’ freedom of expression. Platforms must respect freedom of expression and embed human rights when they design and deploy these models, and as they adapt their policies and enforcement approaches.

The Rights and Interests of Young People

The Board’s newest priority responds to rising global interest in enhancing and protecting teens’ online experiences, considering how technology companies and their platforms can impact and influence them. Protecting teens’ rights to freedom of expression and access to information, while pursuing policies that improve their safety from exploitation, abuse and other offline harms, is one of the greatest challenges facing digital platforms today.

Crisis and Conflict Situations

In times of crisis, such as armed conflicts, civil unrest, terrorist attacks and natural disasters, social media can help people exchange information, debate public matters and document human rights abuses. However, mis/disinformation, calls to violence and hatred can have more immediate and severe negative effects during these periods. Our work has already resulted in Meta introducing a Crisis Policy Protocol, helping its platforms better prepare for crises. We are examining whether Meta’s efforts during conflicts adequately detect serious threats against humanitarian workers and vulnerable communities, and account for international humanitarian law.

Elections and Civic Space

Safeguarding the flow of political information on social media is essential, particularly around elections, when voting can be influenced by factors such as the suppression of content or the spread of mis/disinformation. Platforms must respond rapidly when electoral integrity is under threat and speech is at risk, especially during large-scale protests. From the outset, the Board has urged Meta to consider context when moderating political content, both to prevent overenforcement and protect dissent, and to limit underenforcement of violating posts. Our work also addresses the right to participate in public life, testing the effectiveness of Meta’s policies in these areas.

Gender

Technology-facilitated abuse based on gender and sex not only silences individual voices but can also be coordinated to suppress public debate, particularly when journalists, politicians, activists and human rights defenders are targeted. Such forms of intimidation, harassment and bullying are of concern to the Board, alongside cases that reveal ongoing obstacles to the freedom of expression of women and LGBTQIA+ people.

Government Influence and Pressure on Social Media Platforms

Our cases to date have raised questions about Meta’s relationship with governments, including on content takedown requests, due process and transparency. The Board will scrutinize government pressure on platforms to shape and enforce content policies, especially in indirect or non-transparent ways, as well as coordinated campaigns to suppress or promote speech in line with state interests.

Hate Speech Against Marginalized Groups

Hate speech can create a discriminatory environment online, and in its worst instances lead to serious violence and atrocities offline. This is why addressing speech that causes serious harm, particularly with regard to marginalized groups, is critical to our work. Equally, we seek to ensure these policies allow room for legitimate public debate, including when viewpoints are controversial or unpopular, and to avoid the restriction of counter-speech.

Working With Stakeholders on Our Strategic Priorities

We want to work with organizations to understand the areas in which Meta most urgently needs to improve, and which types of cases could help address them.

To discuss how your organization can get involved, please contact engagement@osbadmin.com.