Facebook Is Hiring 3,000 People to Help Prevent Suicide and Murder Videos From Being Shared

After a string of disturbing incidents

Facebook is hiring 3,000 more people to monitor violent videos on the platform.

Facebook is hiring 3,000 more people for its community operations team, which reviews sensitive material to keep violence, hate speech and child exploitation off the platform. The move brings the division up to 7,500 employees.

In response to a string of disturbing videos that have surfaced on the social network in recent weeks—such as the murder of a Cleveland man, a teenager accidentally shooting himself while broadcasting on Instagram Live Stories, and the killing of an 11-month-old girl in Thailand—Facebook CEO Mark Zuckerberg said the company is expanding its global community operations team.

In a Facebook post today, Zuckerberg said employees will be tasked with reviewing the “millions of reports” the platform receives every week. He said reviewers will help the company more quickly remove content that violates Facebook policies while working with local law enforcement to respond when needed.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook—either live or in video posted later,” Zuckerberg wrote. “It’s heartbreaking, and I’ve been reflecting on how we can do better for our community. If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down.”

The company—which had been criticized in the past for not doing enough to prevent videos from being posted—has been working to improve its monitoring tools. Along with hiring more employees, Facebook is also making it easier and faster to report posts that violate the platform’s policies. Last month, it announced it’s exploring the use of machine learning to prevent offensive videos from being shared.

According to Zuckerberg, a report last week notified Facebook that someone was considering suicide during a livestream broadcast. The company contacted law enforcement, which intervened.

“No one should be in this situation in the first place,” he wrote. “But if they are, then we should build a safe community that gets them the help they need.”

Meanwhile, Facebook’s first-quarter earnings will be reported later today.

The move also comes during the first week of the Digital Content NewFronts in New York, where livestreaming—a dominant topic last year—seems to be taking a backseat to brand safety.
