Oversight Board Opens Public Comments for Policy Advisory Opinion on Cross-Check

In October, the Board accepted a request from Meta, in the form of a policy advisory opinion, to review the company’s cross-check system and make recommendations on how it can be changed. Shortly after, Meta released more details about this request and today the Board is opening public comments for this policy advisory opinion.

Beyond reviewing individual cases to remove or restore content, the Board can accept policy advisory opinion requests from Meta. After receiving input from external stakeholders, the Board provides detailed recommendations on changes Meta should make to its policies on a given topic.

Meta must send the Board’s recommendations through its official policy development process and give regular updates on this, including through its newsroom. While the Board’s policy advisory opinion is not binding, Meta must provide a public response and follow-on actions after receiving our recommendations.

Policy Advisory Opinion 2021-02

Submit public comment here.

*In this summary, Facebook refers to the social media platform while Meta refers to the company that owns and manages Facebook.

This policy advisory opinion request concerns Meta’s policy on cross-check, a system that the company says helps it “ensure that enforcement decisions about [Facebook’s] Community Standards are made accurately and with additional levels of human review.”

Meta told the Board that the cross-check system plays an important role in protecting “Voice” and human rights. The company also stated that the system serves an important role in managing relationships with its business partners. “Incorrectly removing content posted by a page or profile with a large following, for instance, can result in negative experiences for both Facebook’s business partners and the significant number of users who follow them.”

Meta explained that Facebook’s “primary review systems use technology to prioritize high-severity content, which includes ‘viral’ content that spreads quickly.” When systems flag such content for escalation, Facebook reviewers decide if the content should remain on the platform. Meta also explained that although it “aim[s] to make the right decisions, it recognizes that false positives [erroneous removal of non-violating content] do occur.” Cross-check is one of the systems Meta uses to prevent false positive mistakes on Facebook; it is not the only such system the company uses.

Meta stated that, historically, the company determined who should receive cross-check review by compiling lists of users or entities with higher associated risk of false positive actions against them. “‘False positive risk’ refers to the risk of incorrect enforcement against content or entities that do not actually violate” Facebook’s Community Standards. Meta applied a variety of criteria, including the type of user or entity (e.g., an elected official, journalist, significant business partner, human rights organization), the number of followers, and the subject matter of the entity. When users or entities on those lists posted content or took actions that Facebook’s systems flagged, they would be added to a queue for cross-check review.

At the beginning of 2020, Meta: “made changes so that most content in the queue was prioritized using a risk framework, which assigned a level of false-positive risk that could result if Facebook incorrectly removed that content. This risk framework generally relied on three factors: (1) the sensitivity of the entity, (2) the severity of the alleged violation, and (3) the severity of the potential enforcement action. Based on those factors, the content would be assigned one of three tiers of review: low (reviewed by contract reviewers), medium (reviewed by our markets team who have specialized regional expertise), and high (reviewed by our markets team and Early Response team who have deeper policy expertise and the ability to factor in additional context). Within those review tiers, the content in the queue was then prioritized by potential policy violation severity.”
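The three-factor routing Meta describes can be illustrated with a purely hypothetical sketch. The numeric scales, thresholds, and function name below are assumptions made for illustration; Meta has not published how the factors are scored or combined.

```python
# Hypothetical illustration of a three-factor risk framework that routes
# flagged content to one of three review tiers. All scales and thresholds
# are invented for illustration; Meta has not disclosed its actual logic.

def review_tier(entity_sensitivity: int,
                violation_severity: int,
                enforcement_severity: int) -> str:
    """Each factor is scored 0-2 (low/medium/high); the combined
    false-positive risk score decides the review tier."""
    risk = entity_sensitivity + violation_severity + enforcement_severity
    if risk >= 5:
        return "high"    # markets team plus Early Response team
    if risk >= 3:
        return "medium"  # markets team with regional expertise
    return "low"         # contract reviewers
```

Under this sketch, content that scores high on all three factors would be routed to the reviewers with the deepest policy expertise, matching the tier descriptions in Meta’s account.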

Reviewers would then examine the content, confirm whether it violated Facebook’s policies, and if so, enforce those policies. Depending on the complexity of the case, this review could escalate all the way to the company’s leadership.

Meta rolled out additional changes to the cross-check system in 2021, after it conducted a holistic analysis of the system and identified opportunities for improvement. Meta implemented some changes, including breaking the cross-check system into two components: “General Secondary Review” and “Early Response (ER) Secondary Review.”

ER Secondary Review is the historical cross-check system, described above, and will continue to be maintained using lists of entities (i.e. users, pages, groups, etc.). However, Meta changed the process of compiling and revising cross-check lists. “Prior to September 2020, most employees had the ability to add a user or entity to the cross-check list. After September 2020, while any employee can request that a user or entity be added to cross-check lists, only a designated group of employees have the authority to make additions to the list.” The lists relevant to ER Secondary Review contained more than 660,000 entities as of October 16, 2021. Meta explained that the lists are not static and change as entities are added and removed.

General Secondary Review represents the majority of cross-checked content and entities, and will continue to grow. By the end of 2021, Meta aims to make the system available to all Instagram and Facebook users and entities. General Secondary Review will operate using a dynamic prioritization system called “cross-check ranker.” The “cross-check ranker ranks content based on false positive risk using criteria such as topic sensitivity (how trending/sensitive the topic is), enforcement severity (the severity of the potential enforcement action), false positive probability, predicted reach, and entity sensitivity (based largely on the compiled lists, described above).”
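A prioritization system of this kind could be sketched roughly as follows. The criterion names come from Meta’s description; the equal weighting, 0–1 scales, and class and function names are assumptions invented for illustration, not Meta’s actual implementation.

```python
# Hypothetical sketch of a "cross-check ranker" that orders flagged
# content by estimated false-positive risk. Criterion names follow Meta's
# public description; the scoring scheme is invented for illustration.

from dataclasses import dataclass


@dataclass
class FlaggedContent:
    topic_sensitivity: float     # how trending/sensitive the topic is (0-1)
    enforcement_severity: float  # severity of the potential action (0-1)
    false_positive_prob: float   # estimated probability of error (0-1)
    predicted_reach: float       # normalized expected audience (0-1)
    entity_sensitivity: float    # largely based on compiled lists (0-1)


def false_positive_risk(c: FlaggedContent) -> float:
    # Equal weighting is an assumption; any monotone combination would do.
    return (c.topic_sensitivity + c.enforcement_severity +
            c.false_positive_prob + c.predicted_reach +
            c.entity_sensitivity) / 5


def rank_for_secondary_review(queue: list[FlaggedContent]) -> list[FlaggedContent]:
    # Highest-risk items are surfaced for secondary review first.
    return sorted(queue, key=false_positive_risk, reverse=True)
```

The point of a dynamic ranker, as opposed to static lists, is that any piece of content can be re-scored as its predicted reach or topic sensitivity changes, rather than priority being fixed by list membership alone.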

In developing the cross-check ranker, Meta interviewed 14 internal stakeholders across the operations, policy, and product teams to better understand risks of over-enforcement. The company “chose internal stakeholders due to the complications of explaining how enforcement works, but [is] considering external engagement in the future.”

Meta concedes that despite investing significant resources to improve cross-check, it still has difficulty striking a balance between removing content that violates Facebook’s policies “while ensuring that it continues to foster open communication and free expression.” The company asked the Board three questions, which the Board is including in its call for public comments.

Questions posed by Meta to the Board:

  • Because of the complexities of content moderation at scale, how should Meta balance its need to fairly and objectively apply Facebook’s Community Standards with the need for flexibility, nuance, and context-specific decisions within cross-check?
  • What improvements should Meta make to how it governs the Early Response (“ER”) Secondary Review cross-check system to fairly enforce Facebook’s Community Standards while minimizing the potential for over-enforcement, retaining business flexibility, and promoting transparency in the review process?
  • What criteria should Meta use to determine who is included in ER Secondary Review and prioritized as one of many factors by the company’s cross-check ranker in order to help ensure equity in access to this system and its implementation?

Board requests for public comments on the following issues:

  • Whether a cross-check system is needed and if it strengthens or undermines the protection of freedom of expression and other human rights.
  • Cross-check is designed to be a “false positive” prevention mechanism. What are the checks and balances, if any, this system should contemplate to mitigate the risks of “false negatives” [erroneous lack of action on violating content]?
  • Recommendations on what Meta should do to ensure that the cross-check system, including its escalation process, is neutral and free of political and other biases.
  • What factors should Meta incorporate into the “cross-check ranker” system in addition to topic sensitivity, enforcement severity, false-positive probability, predicted reach, and the nature and importance of the entity? How should these factors be defined?
  • The benefits and limitations of automated technologies used to prioritize review of high-severity content.
  • Information on how the cross-check system should and can be improved for users and entities who do not post in English.
  • Information on systems akin to cross-check used by other social media platforms and lessons learned that can be applicable to Meta.
  • How Meta can improve transparency of the cross-check system.
  • What additional research and resources should Meta dedicate to improving the cross-check system?

Public Comments

An important part of the Board’s process for developing a policy advisory opinion is gathering additional insights and expertise from individuals and organizations. This input will allow Board Members to tap into more knowledge and understand how Meta’s policies affect different people in different parts of the world.

If you or your organization feel that you can contribute valuable perspectives to this request for a policy advisory opinion, you can submit your contributions here.

The public comment window for this policy advisory opinion request on cross-check is open until 15:00 GMT, Friday January 14, 2022.

This timeline is longer than the public comment period for new cases as the policy advisory opinion process does not have the same time constraints as case decisions. Additionally, public comments can be up to six pages in length and submitted in any of the languages available on the Board’s website. This should allow broader participation on the issues at stake. The full list of languages is available through the link above.

How Does the Board Respond to a Request for a Policy Advisory Opinion?

1. The Board accepts Meta’s request for a policy advisory opinion

When Meta sends a policy advisory opinion request to the Board, it is assigned to one of the Board's four Co-Chairs. The relevant Co-Chair drafts a memo to all Board Members summarizing the main issues raised. Based on this, Board Members vote to accept or reject Meta's request.

2. Committee develops policy options

If the Board accepts the request, the Co-Chair assigns a committee of at least five Board Members. Board Members are invited to volunteer based on expertise and interest. Any remaining vacancies are assigned randomly to Board Members who are not Co-Chairs, while also ensuring gender diversity.

After appointing a lead drafter, the committee agrees on the additional information to request from Meta, as well as research from the Oversight Board Administration and plans for stakeholder engagement. Once the committee has received answers to its questions from Meta, the Administration and external stakeholders, it drafts policy options for the full Board to consider.

3. The Board deliberates and makes recommendations

All Board Members attend deliberations to discuss these policy options. Based on these deliberations, the lead drafter prepares a policy advisory opinion which includes detailed recommendations for Meta. Board Members offer feedback on this and propose edits.

4. Policy advisory opinion is approved and published

The policy advisory opinion is circulated to all Board Members who approve or reject it by a majority. If approved, the policy advisory opinion is published on the Board's website and Meta must respond to it. If rejected, Co-Chairs discuss the reasons for this and determine how to proceed.

What Next

The Board has accepted this request for a policy advisory opinion and is collecting the necessary information, including through the call for public comments which has launched today.

In June we accepted another request for a policy advisory opinion on the sharing of private residential information. We will be publishing this opinion early next year.