After some horrific events on Facebook Live, including suicides and reports of a murder, Facebook is adding another 3,000 people to its worldwide support staff. The company already had 4,500 people responding to user reports.
In a Facebook post, CEO Mark Zuckerberg said, “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
He added that the reviewers will also help Facebook remove content that shouldn’t be on the service, such as hate speech and child exploitation. He said the company will make it simpler for users to report problems and that “we’re also building better tools to keep our community safe.” He didn’t specify the tools, but it’s been reported that Facebook engineers are using artificial intelligence to help identify and remove objectionable content.
“Just last week,” wrote Zuckerberg, “we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”