Facebook to hire just 3,000 more people to moderate the content of its 1.9 billion users


There's been a lot of disturbing content on Facebook in recent weeks and months -- murders, rapes, assaults, shootings and more. After a fatal shooting was shared on the site, Facebook vowed to do more to combat this sort of material.

As the social network announced that it now has 1.9 billion users, it also announced that it is to employ an extra 3,000 people to help moderate content. This brings the company's total number of moderators to 7,500. Can this possibly be enough to manage the posts of nearly 2 billion people, and who on earth would want to be tasked with viewing some of the most gruesome content to determine whether or not it should be removed?

Facebook's team of 7,500 moderators has the unenviable job of monitoring the postings of a quarter of the world's population. The social network has faced a great deal of criticism for reacting too slowly to requests to remove offensive videos and other content, but is it really expecting to make much of an improvement with such a small number of people?

Announcing the new jobs, Mark Zuckerberg said:

Over the next year, we'll be adding 3,000 people to our community operations team around the world -- on top of the 4,500 we have today -- to review the millions of reports we get every week, and improve the process for doing it quickly.

If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down.

By Zuckerberg's own admission, there are "millions of reports" to deal with each week. That's a hell of a workload for what is, really, a pretty small team. And that's before you consider the psychological impact this work is going to have on the individuals concerned -- after all, it is impossible to determine whether a video contravenes laws or Facebook policies without actually watching it.

Of course, some of the content that's reported is fairly harmless. Facebook does not allow, for instance, images of nudity to be posted, and few moderators would suggest they'd be scarred for life by having to check reports of such content. But when it comes to reports about sexual abuse, murder, suicide and the like, it's a very different story. Seeing one video of a real human being beheaded is disturbing; seeing it over and over again is going to have long-term repercussions.

There is a massive amount of highly brutal material available online, and a portion of it ends up on Facebook. Having to review this material is going to be grueling work. Even for people who feel they can handle watching violent movies, being subjected to footage of real people suffering real-world violence and brutality is going to have a lasting effect.

Facebook is desperate to be seen to be doing the right thing when it comes to keeping its network free from content that might break the law or cause extreme offense. What's not clear, however, is whether the company takes its responsibility to its own employees as seriously.

Image credit: hafakot / Shutterstock

