If you send nudes to Facebook for revenge porn protection, the photos will be checked by humans

Facebook caused more than a little eye-rolling with its anti-revenge porn program, which requires users to upload the naked images of themselves that they want to protect. It had been assumed that the entire process would be automated, relying purely on algorithms to analyze images and preserve privacy. That assumption was wrong.

Facebook now says that an employee -- an actual human being -- will have to review the nude images that are sent in.

A Facebook spokesperson confirmed that any intimate photos that are uploaded in an attempt to gain protection against revenge porn will have to be manually checked by human eyes to determine whether they do in fact fall into the category of revenge porn.

While this certainly reduces the scope for the new system to be abused to block the sharing of other, non-intimate images, it's a process few people will be happy with, and one that may well dissuade revenge porn victims from using the service.

The employee would be part of Facebook's community operations team, and would be "specially trained" to view such uncensored images. Facebook has neither confirmed nor denied suggestions that blurred versions of images would be retained after image hashes had been created.
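Facebook has not published the hashing method it uses, but the general idea behind image hashing is to reduce a photo to a compact fingerprint that can be matched against future uploads without storing the photo itself. The sketch below illustrates one well-known technique, a perceptual "average hash": every name and value in it is illustrative, not Facebook's actual implementation.

```python
# Illustrative sketch only -- Facebook's actual hashing method is not public.
# A perceptual "average hash" turns an image into a 64-bit fingerprint so
# that near-duplicates can be detected without keeping the image around.

def average_hash(pixels):
    """Build a 64-bit fingerprint from an 8x8 grayscale thumbnail:
    each bit records whether a pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# Hypothetical 8x8 grayscale thumbnails (pixel values 0-255)
original = [10 * i % 256 for i in range(64)]
slightly_edited = [min(255, p + 5) for p in original]  # e.g. a brightness tweak

d = hamming_distance(average_hash(original), average_hash(slightly_edited))
print(d)  # small distance -> flagged as a likely match
```

Because the fingerprint survives minor edits such as brightness changes or recompression, a service only needs to retain the hash, not the image, to block re-uploads -- which is what makes the question of whether blurred copies are also retained significant.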

Image credit: Africa Studio / Shutterstock

© 1998-2017 BetaNews, Inc. All Rights Reserved.