by Larry Magid
This post first appeared in the Mercury News
In 2018, Facebook CEO Mark Zuckerberg announced that the company would appoint a content oversight board to act as the final arbiter of content moderation decisions. Like the U.S. Supreme Court, the board wouldn’t review all decisions, only a small selection that rise to the top, perhaps because they affect a great many people or because they are particularly thorny. Also like the Supreme Court, the board’s decisions are meant to be final. Even Zuckerberg can’t override the board.
On Wednesday, the newly formed oversight board announced its first 20 members and briefed the press on how it plans to operate. The roster of members reads like a Who’s Who of international luminaries, including the former (and first woman) Prime Minister of Denmark, Helle Thorning-Schmidt. Other members include Nobel Peace Prize laureate Tawakkol Karman from Yemen; Pakistan’s Nighat Dad, founder of the Digital Rights Foundation; and several law professors and other high-level individuals from around the world. You can learn more about the board and its members at OversightBoard.com.
At the press briefing, Facebook’s policy director Brent Harris said “Facebook will implement the board’s decisions unless doing so violates the law.” Thorning-Schmidt, one of four board co-chairs, said “some of the most difficult decisions around content have been made by Facebook, and you could say ultimately by Mark Zuckerberg. And that’s why I feel that this is a huge step for the global community that Facebook has decided to change that with the oversight board. We will now for the first time have an independent body, which will make final and binding decisions on what content stays up and what content is removed.”
One could argue that creating a board like this to handle appeals of moderation decisions is overkill, but if you look at the impact of Facebook’s content decisions on how people make important decisions, it starts to make sense. Facebook’s role in helping to spread misinformation during the 2016 election is well known. But even as the company strives to rid itself of fake or misleading information, it encounters obstacles, including from politicians and elected officials who feel that Facebook is making decisions based on ideology and political preferences rather than on a strict set of rules that applies to people of all persuasions. The company has tried arguing that its decisions are based on behavior and that it’s not putting its thumb on the political scales, but that hasn’t quieted critics who are convinced that Facebook, along with Google and to some extent Apple and Amazon, is biased. Most of the criticism comes from conservatives, but I’ve seen it coming from the left as well.
On its website, the oversight board argues that “freedom of expression is a fundamental human right … but there are times when speech can be at odds with authenticity, safety, privacy, and dignity. Some expression can endanger other people’s ability to express themselves freely.” The board will make content decisions for both Facebook and its subsidiary Instagram.
Relevant to pandemic
Although the board was announced long before the COVID-19 pandemic, it’s relevant in that the pandemic has brought out a lot of people using Facebook to promote false cures, sell shoddy products and spread false information about the nature of the virus. Facebook has been pretty aggressive about taking down what it considers to be dangerous false information, but what one person considers dangerous, another person might consider a worthwhile cure. Right now, Facebook is on its own when it comes to establishing and enforcing policies regarding dangerous content, but it now has a board to advise it on policy and make final decisions on content for those cases selected for review.
The board can also make policy recommendations, but Facebook will continue to set its own policies. Facebook has provided the board with a $130 million operating budget, which it says cannot be revoked. Members are compensated, but board officers did not disclose the amount.
I don’t blame you if you’re skeptical. After all, this board was set up and funded by Facebook, which has a lot of work to do to protect its reputation given all of the problems related to inappropriate content and accusations of bias. It’s also clear that Zuckerberg has been on the defensive in his dealings with Congress and other elected officials.
But, after listening to several of the members of this new board and reading through its charter, I’m willing to give the board and Facebook the benefit of the doubt.
It won’t be perfect. Mistakes will be made, and some important content decisions that should be reviewed probably won’t be, because the board can’t possibly review all the cases. Decisions will be made that many people disagree with. In that sense, it’s no different from courts, which have to weigh competing values and sometimes unclear facts and come to a decision that some people will not like.
But one thing I like about this board is that it will be able to make its decisions without worrying about stockholder approval, the bottom line or the ire of Mark Zuckerberg, who will no doubt agree with some of its decisions and disagree with others. I also like the fact that the board says it will be transparent and will publish its decisions so that the public knows what it considered.
As CEO of ConnectSafely.org and a member of safety advisory boards at Facebook, Snapchat, Twitter, Roblox and other companies, I’ve learned that content decisions are often nuanced and not always tied to the company’s business interests. Sometimes there are competing rights, like the right of free speech versus the right not to be harassed or bullied or exposed to content that is highly disturbing, racist, sexist or otherwise hateful. These are often difficult decisions that require serious deliberation, and I’m pleased to know that there is a board of people who are equipped to do that hard work.
Disclosure: Larry Magid is CEO of ConnectSafely.org, a non-profit internet safety organization that has received support from Facebook and other technology companies.