Facebook should be applauded for not blocking violent videos
There's a lot of crazy content out there. Social networks fill up with funny footage, music videos, informative clips, and the downright stupid. As with movies, there's also a good deal of graphic content, and some of it finds its way onto Facebook.
This may be something you have already seen -- the rollout started quietly back in December -- but Facebook has now confirmed that it has started to place warning messages on videos that have been reported by users. Now, videos that are reported and subsequently deemed suitable only for an adult audience not only carry a warning message at the start, but also no longer play automatically in people's timelines as other videos currently do.
Talking to the BBC, a Facebook spokesperson said:
We also ask that people warn their audience about what they are about to see if it includes graphic violence. In instances when people report graphic content to us that should include warnings or is not appropriate for people under the age of 18, we may add a warning for adults and prevent young people from viewing the content.
This is a remarkably enlightened move by Facebook, which very recently acquired video compression firm QuickFire. The social network is not unfamiliar with the notion of upsetting people -- as its Year In Review posts showed -- but the Charlie Hebdo shooting led Mark Zuckerberg to speak out in favor of freedom of speech. It's good to see that, within the realms of what has already been deemed legal, Facebook is allowing for something other than the black and white options of "allow everything" or "ban anything seen as offensive".
Now, users have an extra layer of protection. Those of a delicate disposition, the easily offended, the prudish, and the weak of constitution now have a way to avoid content that might upset them. Of course this isn't going to stop people from complaining, but Facebook should be applauded for taking this extra step to shield those who want to be shielded.
The Guardian points out that the scheme currently applies only to videos that are violent, but as it is a work in progress, it's possible that the warning will roll out to other types of video content as well. Facebook has previously come under fire for removing images of mothers breastfeeding, but there is no reason the same warning system could not be used in these instances.
Of course, it is not a perfect system. A warning is just a warning, and there will almost certainly continue to be complaints from people who "accidentally" click through, or from parents who are concerned that a warning message acts as a draw rather than a deterrent to their offspring. It's not perfect, but it's a start, and it's a lot better than simply censoring and sanitizing Facebook to cater to the lowest common denominator.