Facebook is hiring a lot more humans to deal with its violence and harassment problem

Source: Getty Images

Videos of suicide, violence and harassment have all recently been uploaded to Facebook, and the company has come under fire for struggling to review and remove the content with a sense of urgency.

On Wednesday, Facebook CEO Mark Zuckerberg announced that the company will add 3,000 people to its community operations team worldwide to better handle explicit and sensitive content on the platform.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted later," Zuckerberg wrote in a Facebook post. "It's heartbreaking, and I've been reflecting on how we can do better for our community."

Zuckerberg noted that these 3,000 individuals will join the existing 4,500 people on the team.

"These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation," Zuckerberg wrote. "And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they're about to harm themselves, or because they're in danger from someone else."

He added that the company is "building better tools" to let users flag issues more easily, and to let reviewers more quickly determine which reported content violates the company's standards and contact law enforcement where needed. While Zuckerberg didn't detail what these tools might look like, the statement is at least an acknowledgment that the company is not doing enough to prevent and remove content that goes against its community standards.

Adding thousands of humans to the team also signals a pivot away from Facebook's reliance on algorithms to flag and respond to inappropriate and violent content.

While it's reassuring to see Facebook trying to get a handle on the issue, the company should have foreseen the potential downsides of its live video streaming product before releasing it, and should have had better tools and teams in place before rolling the product out to millions of users.

So when "Facebook says, 'Oh, we're trying to figure out ways to get a handle on [violent, abusive content],' that should be an unacceptable response," Mary Anne Franks, a professor at the University of Miami School of Law and the legislative and tech policy director of the Cyber Civil Rights Initiative, told Mic. "Because if they didn't have a handle on it before, they shouldn't have rolled out the product."


Melanie Ehrenkranz

Melanie is a writer covering technology and the future. She can be reached at melanie@mic.com.

