If you send nudes to Facebook for revenge porn protection, the photos will be checked by humans
Facebook caused more than a little eye-rolling with its anti-revenge porn program, which requires users to upload the very naked images of themselves that they want to protect. It had been assumed that the entire process would be automated, relying purely on algorithms to analyze images and protect privacy. That assumption was wrong.
Facebook says that an employee -- an actual human being, not an algorithm -- will have to review the nude images that are sent in.
See also:
- Facebook: send nudes and we'll protect you against revenge porn
- Transparency: Facebook to force political ads to disclose funding sources
- Does Facebook listen in via your microphone to tailor your ads?
A Facebook spokesperson confirmed that any intimate photos that are uploaded in an attempt to gain protection against revenge porn will have to be manually checked by human eyes to determine whether they do in fact fall into the category of revenge porn.
While human review certainly reduces the potential for the system to be abused to block the sharing of unrelated images, it's a process that few people are likely to be happy with, and one that may well dissuade revenge porn victims from using the service.
The employee would be part of Facebook's community operations team, and would be "specially trained" to view such uncensored images. Facebook has neither confirmed nor denied suggestions that blurred versions of images would be retained after image hashes had been created.
There are algorithms that can be used to create a fingerprint of a photo/video that is resilient to simple transforms like resizing. I hate the term "hash" because it implies cryptographic properties that are orthogonal to how these fingerprints work.
— Alex Stamos (@alexstamos) November 8, 2017
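To illustrate the distinction Stamos draws, here is a minimal sketch of a perceptual fingerprint: a simple "average hash" computed over an 8x8 grayscale grid. Unlike a cryptographic hash, where changing one pixel scrambles the entire output, a small uniform change to the image flips few or no bits of the fingerprint, so near-duplicate images stay close in Hamming distance. This is an illustrative toy under assumed inputs, not Facebook's actual technique; production systems use far more robust features.

```python
def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int
    with one bit per pixel: 1 if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A toy "image": bright left half, dark right half.
img = [[200] * 4 + [40] * 4 for _ in range(8)]
# The same image after a uniform brightness shift.
shifted = [[p + 10 for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(shifted)
print(hamming(h1, h2))  # → 0: the fingerprints match despite the edit
```

Because the threshold is the image's own mean brightness, a uniform shift leaves every bit unchanged, which is exactly the kind of transform-resilience a match-and-block system needs and a cryptographic hash cannot provide.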
Image credit: Africa Studio / Shutterstock