Facebook on Wednesday announced the first 20 members of its Oversight Board, an independent body that can overturn the company's own content moderation decisions.
The oversight board will hear appeals from Facebook and Instagram users as well as questions referred by Facebook itself, although the board acknowledged it will have to pick and choose which content moderation cases to take given the sheer volume of them.
The board will receive cases through a content management system linked to Facebook's platforms. Board members will then discuss each case as a group before issuing a final decision on whether the content should be allowed to stay up.
Facebook announced it was creating the independent board in November 2018, shortly after The New York Times published a report detailing how the company avoided and deflected blame in the public conversation around its handling of Russian interference and other misuses of the social network.
The members are a globally diverse group that includes lawyers, journalists, human rights advocates and academics. Between them, they are said to have expertise in areas such as digital rights, religious freedom, conflicts between rights, content moderation, internet censorship and civil rights.
Notable members include Alan Rusbridger, former editor in chief of The Guardian newspaper, and Andras Sajo, a former judge and vice president of the European Court of Human Rights.
Helle Thorning-Schmidt, a former Prime Minister of Denmark, is one of the board's four co-chairs. "Up until now some of the most difficult decisions about content have been made by Facebook and you could say Mark Zuckerberg," she said on a call with the press Wednesday. "Facebook has decided to change that."
The board will begin hearing cases in the coming months. It will eventually grow to around 40 members, whom Facebook said it will help select.
"It's one thing to complain about content moderation and the challenges involved, it's another thing to actually do something about it," said Jamal Greene, co-chair of the board. "These problems of content moderation really have been with us since the dawn of social media, and this really is a novel approach."
The move could help Facebook avoid accusations of bias as it removes content deemed problematic. Some lawmakers and conservative speakers have said that Facebook censors politically conservative points of view, a claim the company rejects.
"It is our ambition and goal that Facebook not decide elections, not be a force for one point of view over another, but the same rules will apply to people of left, right and center," Michael McConnell, another co-chair of the board, told reporters Wednesday.
Facebook pledged $130 million in funding to the board last December, with the money expected to cover operational costs for at least six years. Board members will be compensated an undisclosed amount for their time.
Facebook in January outlined the board's bylaws, making it clear that the social media giant is still in control. The board's decisions do not necessarily set precedents that Facebook must follow in the future, and the board is limited in the kinds of content it can address.
The board said it will publish transparency reports each year and monitor what Facebook has done with its recommendations.
"It will be very embarrassing to Facebook if they don't live up to their end of this," Thorning-Schmidt said.
Brent Harris, Facebook's director of global affairs, said Facebook will implement the board's decisions "unless they violate the law."
The full list of members includes: