Threads Users Can Now Appeal Meta’s Content Decisions To The Oversight Board  

This marks the first time the Board's remit has expanded to a new social media app since it launched in 2020 with Facebook and Instagram

People using Threads – Meta’s app for sharing text updates and joining public conversations – can now appeal the company’s content moderation decisions to the Oversight Board. This means users who want to challenge Meta’s decisions to remove or leave up posts on Threads can request independent review.

“The Board’s expansion to Threads builds on our foundation of helping Meta solve the most difficult questions around content moderation,” said Oversight Board Co-Chair Helle Thorning-Schmidt. “Having independent accountability early on for a new app such as Threads is vitally important.”   

“With conflicts raging in Europe, the Middle East, and elsewhere, billions of people heading to the polls in global elections, and growing abuse towards marginalized groups online, there is no time to lose,” added Oversight Board Co-Chair Catalina Botero Marino. “Advances in artificial intelligence can make content moderation even more challenging.

“The Board is dedicated to finding concrete solutions that protect freedom of expression while reducing harm, and we look forward to setting standards that will improve the online experience for millions of Threads users.” 

The appeals process for Threads mirrors the one for Facebook and Instagram. Once users have exhausted Meta’s internal appeals process, the company will issue an Oversight Board reference ID in their Support Inbox, which allows users with active accounts to submit their case for review on the Oversight Board website. To date, the Board has reversed Meta’s original decision almost 80 percent of the time. In addition to appeals from the more than 130 million people using Threads, Meta will also be able to refer cases about content on Threads directly to the Board.

The Board will hold Meta accountable to its Community Guidelines, which apply to Threads, as well as to the company’s commitments on free expression and other human rights principles, which have always guided the Board’s work.

As always, the Board will select cases that are emblematic of broader content moderation issues across Meta’s platforms, including questions about its policies and enforcement. By tackling problems shared by millions of users and proposing solutions to them, the Board ensures that the impact of its work is felt far beyond individual cases.

Through its work, the Board has pushed Meta to introduce various changes. These include the creation of Account Status, a feature that tells people what penalties Meta has applied to their account and why. The company also created new classifiers to stop breast cancer awareness content from being automatically removed, and it is finalizing a new, consistent approach to preserving potential evidence of atrocities and serious violations of human rights law.

Pursuing long-term plans for scope expansion is a key goal for the Board. In 2022, the Board’s scope was expanded to include the ability to apply warning screens marking certain posts as ‘disturbing’ or ‘sensitive’ when restoring them or leaving them up.

You can learn more about this announcement on our blog.  

Background  

The Oversight Board is an independent organization comprising global experts who hold Meta accountable to its Community Standards for Facebook and Instagram as well as to its human rights commitments. The Board has contractually binding authority over Meta’s decisions to remove or leave up content on its platforms. The Board also issues non-binding recommendations that shape Meta’s policies to ensure the company is more transparent, enforces its rules evenly, and treats users more fairly. Meta has fully or partially implemented 67 of the Board’s recommendations made to date. The Board’s decisions are based on free speech and other human rights principles.
