In recent years, Facebook has grappled with major content moderation issues: conspiracy theorists and politicians running false ads, as well as the spread of doctored videos.
In response to these controversies, the company has tried to regulate content based on its community standards, and it is often denounced for its decisions. Issues arise, Facebook responds, always imperfectly in the eyes of its critics, and outrage ensues.
It’s how the system has always worked. But that’s going to change soon.
This summer, Facebook plans to create an independent oversight board to adjudicate what content is allowed on the site, based on its community standards. The board is being called Facebook’s “Supreme Court.”
In a charter released last September, Facebook announced the board’s structure. Here’s how the new system will work.
Once the board is in place, Facebook can still restrict and allow content as it pleases. However, either the original poster or the person who flagged the content will be able to petition the board to take up the case. The board won’t take up every submission; instead, it will choose “cases that have the greatest potential to guide future decisions and policies,” according to the charter.
On the expertise required of members, the founding document says, “Members must have demonstrated experience at deliberating thoughtfully and as an open-minded contributor on a team; be skilled at making and explaining decisions based on a set of policies or standards; and have familiarity with matters relating to digital content and governance, including free expression, civic discourse, safety, privacy and technology.”
Facebook will appoint the initial board members. After that, a sub-committee of the board will select new members from candidates suggested by the public and by Facebook. The board will start with 11 members and can grow to as many as 40 if needed. Cases will take about 90 days to decide, but in instances “[where] content could result in urgent real-world consequences,” they will be expedited.
Once the board comes to a decision, it will direct Facebook to allow or remove the post and all similar content. Those decisions will be binding.
To protect the board’s autonomy, Facebook plans to create an independent trust, which will be responsible for appointing members and funding the board. Facebook will appoint the trustees, and the trust will also have the ability to remove board members if they violate Facebook’s code of conduct.
The trust will be an important part of this structure, since it is Facebook’s attempt to separate itself from the board. On the question of whether the board can be trusted to independently adjudicate cases, Rebekah Tromble, Associate Director of the Institute for Data, Democracy, and Politics at George Washington University, expressed skepticism.
“It’s not wrong to be skeptical about that and continue to put the oversight board under scrutiny, but I’m not sure I can think of an ideal alternative to that model that would really function properly,” Tromble said.
Tromble said Facebook may consider ad hoc appointments of members to handle specific cases where more expertise is needed, but she lamented the difficulties that would bring.
“On an ad hoc basis, trying to identify people who can step in in a myriad of cases all around the world, it’s just going to pose a logistical challenge. But if they don’t do that, then there’s always going to be a great deal of, I think, valid skepticism about the knowledge and understanding of cultural contexts that’s informing any of the decisions.”
As the board would represent over 2 billion users, the issue of diversity, both cultural and proportional, looms large.
“The Board will be an advocate for our community — supporting people’s right to free expression, and making sure we fulfill our responsibility to keep people safe,” said founder and CEO Mark Zuckerberg in a letter regarding the charter. “As an independent organization, we hope it gives people confidence that their views will be heard, and that Facebook doesn’t have the ultimate power over their expression. Just as our board of directors keeps Facebook accountable to our shareholders, we believe the Oversight Board can do the same for our community.”
In January, Steven Levy, a reporter at Wired, covered a workshop Facebook held to explain the board to the public, one that touched on the likely demographics of the board. “Facebook seemed to think it was…people like us, in the room—well-educated, comfortable technocrats or public policy wonks. You can bet that some of the members will come from human-rights backgrounds,” Levy wrote.
Others, like Kate Klonick, a professor at St. John’s University School of Law, have suggested that the creation of the independent board may serve to insulate Facebook from future criticism.
When the board was announced, Klonick, who received extensive access to Facebook’s governance team, wrote in a New York Times op-ed that it was “in the best interests of Facebook,” as “such a tribunal would be a convenient scapegoat for contentious decisions.”
“‘Don’t like how we dealt with the takedown of the Alex Jones pages? Don’t blame us! It was the Council!’” she added.