ConnectSafely’s interview with Facebook head of global safety, Antigone Davis

As Facebook’s head of global safety, Antigone Davis helps develop and enforce the guidelines that govern how more than 2.4 billion people around the world use the company’s platforms. She has to juggle competing demands, not just balancing safety against the company’s need to make money, but also reconciling safety issues that sometimes compete with one another. For example, just about everyone agrees that child protection is paramount, which is why Facebook and most other tech companies have robust policies to combat child exploitation. But there is also general agreement that users have a right to privacy and security, which is why Facebook is planning to encrypt user data on nearly all of its platforms. Safety advocates and law enforcement personnel worry that encryption will let child predators and other criminals avoid detection, while privacy and security advocates demand encryption to protect confidential data, even though it makes it harder (though, Davis says, not impossible) for law enforcement to investigate crimes.

Free speech also butts up against safety, as we’ve seen with the battle over whether to allow people to post unverified claims about COVID-19 or election results. Some say that Facebook’s efforts to police that content violate their free speech rights, while others criticize Facebook for not doing enough to protect its users from dangerous false information.

Even issues like whether it’s OK to post graphic violence can be controversial. Facebook generally prohibits it, but there are exceptions based on the motive of the person posting: glorifying or celebrating violence is always prohibited, but graphic content might be allowed if it’s part of a campaign to condemn that violence. That’s one of the reasons Facebook recently convened its independent content review board, which operates as a sort of “Supreme Court” for a handful of very difficult decisions. Still, day-to-day enforcement and policy decisions remain with company executives like Davis.

Our interview

As part of ConnectSafely’s 2021 Safer Internet Day commemoration, we spoke with Davis via Zoom, starting with Facebook’s general approach to safety.

“We really try to look at it from a 360-degree view,” said Davis. “So we look, first of all, at our policies: do we have the right policies in place for what people can and cannot share? Do we have the right tools in place, both the tools that you might see as a user (default privacy settings, etc.) but also the tools that might be running in the background, such as machine learning or AI?”

And even with thousands of moderators around the world, she still wrestles with the question, “Do we have the right experts working at Facebook to get the right information to actually do a good job?” Many people complain about Facebook’s privacy policies but don’t understand the extent to which they can control who sees what. That may be because those controls are difficult to navigate or because Facebook hasn’t done enough to explain them, but it’s important for users to know how to limit who sees their posts and information. Facebook does use personal information to display targeted advertising but says it does not share that information with advertisers. Davis recommends that people review their privacy settings regularly. It’s not a “set and forget” one-time event “because, at one point in time, you may have wanted to share more than you may at another point in time depending on circumstantial changes,” said Davis.

I recommend that people go into your settings on a regular basis. Because at one point in time, you may have wanted to share more than you may at another point in time depending on circumstantial changes. (Antigone Davis)

Misinformation

Misinformation has become a major problem around the world. It was very much on people’s minds before and after the 2020 U.S. presidential election, and it has been a major issue since the advent of COVID-19, with some people sharing dangerous false information about so-called “cures” or unfounded claims designed to discourage people from taking precautions or being vaccinated. Davis said that Facebook has “tight health misinformation policies and we’ll either remove false claims or fact-check claims to reduce their visibility.” She said, “The goal for us is to actually get people to accurate information when they’re looking for it.” To that end, “we have a COVID information hub, where people can go to find accurate information, can find out information about where to go further to get vaccines, the latest protocols, etc.”

Globally, Facebook is working “with ministries of health around the world to ensure that people have access to good information because it’s very important in a period like this.”

Hate speech or free speech?

Although many say it’s not doing enough, Facebook has made an effort to cut down on what is commonly considered hate speech. But there isn’t universal agreement over what, and whom, it should ban. For example, some politicians have accused Facebook of censorship, especially after it suspended the account of then-President Donald Trump in the aftermath of the January 6th invasion of the Capitol. Davis is clear that “we don’t allow hate speech” but admits that the issue can be nuanced. “There are also areas where it can be very complicated … not so much in the context of allowing hate speech, which we don’t allow, but more in the context, for example, when you have political figures, and there’s a certain amount of freedom of expression that you want to actually allow to occur so that people can make decisions about those political leaders, so they can hear the information, they can assess who they are, and they can actually respond by voting.” But she reaffirmed, “We don’t allow hate speech. We aren’t going to allow the incitement of violence. And we’re going to take action against that kind of content.” Even, as the company demonstrated in January, if it comes from the world’s most powerful person.

Child sex abuse images

Like all reputable sites, Facebook does not allow so-called “child pornography” on its platform. “How we go about preventing it on our platform takes a number of different techniques,” Davis said, including “photo-matching technologies to identify potentially known material of this kind, as well as technologies to identify new material that may be produced and shared on our platform.”

That’s an important distinction. While all child sexual abuse images are horrific, reports of new material demand urgent attention from law enforcement because they could indicate a child currently being abused. In the U.S., all instances of this illegal content must be reported to the National Center for Missing & Exploited Children’s (NCMEC) CyberTipline, and in 2019 Facebook reported 18 million images. But Davis cautions that the number can be misleading. She said that “90% of the content that we reported to NCMEC last year (based on October and November 2020 reporting) actually was images that had previously been reported that our technology was detecting before users reported it to us.” In other words, many of the same images are shared over and over again, which is itself a violation of the victim because each share re-victimizes the person in the picture, even years later. Facebook, she said, is working on ways to reduce that sharing, “because anytime this content is shared, it victimizes someone.”

Positive ending

As is often the case with an interview with a safety official, much of our conversation was about distressing topics. But, as Facebook users ourselves, we know that the vast majority of what happens on reputable social media platforms is positive. So we asked Davis about her favorite use of social media.

Oh, goodness, actually, it’s a really simple answer. My favorite use of social media is staying in touch with my daughter. She’s 23. She lives in New York City at this point, which means I don’t get to see her every day. I don’t get to tell her to clean up her room every day. And I actually enjoy using social media both to watch the work that she’s doing and to stay in touch with her through things like messaging.

