We want to wipe out child porn online, but can it actually be policed?

Filtering web content is always something of a sticky topic, and there are two very vocal sides to the argument. In the blue corner (red and blue are not being used in a political sense here -- if only because the blue represents right-wing parties in the UK and the more left-leaning in the US) we have those who advocate freedom of speech online, the right for anyone to say whatever he or she wants. In the red corner are those who feel that there is a need for policing, control and regulation.

It is certainly a difficult balance to strike. It would be all but impossible to draw a baseline level of sensibilities that should be catered to -- whatever form of control may be put in place will be seen as draconian in its severity by some, and pathetically lenient by others.

There are many subjects that are open to debate. Freedom of religious expression means that everyone is free not only to express their beliefs, but also to support and promote them. This is true even when different religions are at odds, and the same holds true for political beliefs. The western world looks on almost in a state of disbelief at what passes for the internet in China.

ISPs around the world are under constant pressure from various sides to censor, filter and police what their customers are able to access. Copyrighted material is removed from websites quite frequently, but this does very little to combat piracy. Attempting to stop anything on the internet is a virtually impossible task: the infrastructure to bypass almost any restriction is already in place -- unless a country takes the extreme step of building its own version of China's Great Firewall.

As most of the world is (fairly) democratic, few countries are willing to go this far, so it is left to ISPs and web hosts to manage things. In the UK, Prime Minister David Cameron is calling on internet companies such as Facebook and Google to do more to tackle online pornography. Google has already offered to invest heavily in cleaning up porn and in developing a hashing technique that will make it easier to track images as they spread across the web.
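Neither Cameron's demand nor Google's offer comes with much technical detail, so as a rough illustration of what image hashing means in practice, here is a minimal sketch in Python. It is not Google's actual technique, and the function name and blocklist usage are hypothetical: the idea is simply that a fingerprint computed from a file's bytes lets exact copies of a known image be recognised wherever they resurface, while production systems tend to use perceptual hashes that also survive resizing and re-encoding.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Return the SHA-256 digest of a file's bytes.

    Exact copies of an image yield the same digest, so a known image can
    be matched against a blocklist wherever it turns up. This is only a
    sketch: real matching systems rely on perceptual hashes, which also
    catch resized or re-encoded copies.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage: compare an upload against a set of known digests.
# known_bad = {"9f86d081884c7d65..."}
# if image_fingerprint("upload.jpg") in known_bad:
#     flag_for_review("upload.jpg")
```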

But who should be held responsible for content that is deemed unsuitable or illegal? The person who posted it? There are different laws in different countries to consider. The hosting company which essentially makes the content available? This would be a gargantuan and prohibitively expensive operation, particularly for smaller companies. How about ISPs for ultimately delivering content to individual computers? You may as well take things further and start suing monitor manufacturers for providing a technology that could be used to display all manner of illegal content.

Then there is the Scunthorpe problem to consider. For the uninitiated, Scunthorpe is an English town. Nothing extraordinary in this, but the name does include a four-letter word that has fallen foul of many an over-enthusiastic obscenity filter.
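For the curious, here is a minimal sketch of the kind of naive substring filter that causes the problem. The blocklist uses a deliberately mild stand-in word rather than the obscenity embedded in the town's name, but the failure mode is identical, and the whole-word variant shows one common way filters try to avoid it.

```python
import re

BLOCKLIST = {"hell"}  # deliberately mild stand-in for a real obscenity list

def naive_filter(text: str) -> bool:
    """Flag text if a blocked word appears anywhere in it, even buried
    inside a longer, entirely innocent word (the Scunthorpe problem)."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKLIST)

def word_boundary_filter(text: str) -> bool:
    """Only flag whole-word matches, which avoids this class of false positive."""
    lowered = text.lower()
    return any(re.search(rf"\b{re.escape(word)}\b", lowered) for word in BLOCKLIST)

print(naive_filter("The Shellfish Company"))          # True  (false positive)
print(word_boundary_filter("The Shellfish Company"))  # False
```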

This is obviously not something that relates directly to the issue of porn, but it does raise the question of context. Nudity is not always porn, nor indeed are images of children, even naked children, something that should be viewed as 'wrong'. How many photos do your parents have of you as a child? Hundreds, no doubt.

I'm willing to bet hard cash that a proportion of these feature you in some state of undress -- mum and dad always take snaps of their kids in the bath, cavorting naked in the garden, or skinny dipping at the beach. These are not porn, are they? When you bring a new partner to meet your parents, these photos may be pulled out as a bit of fun. How different is this -- really -- to sharing the same images on Facebook?

Well, they probably wouldn't last long on Facebook, as virtually all nudity is quickly stamped out, but the same photos could be shared on a personal website and little would be thought of it. It is about context. A photo of a child in the bath included in a gallery of childhood photos is quite innocent; include the same image in a gallery that also features more illicit images and it's a different matter entirely.

But things aren’t always that black and white. The notion of context is something that is quite subjective. An image that is completely innocent, pure and savoury to the majority of people could be something quite titillating to someone else.

Of course, there are organized child porn rings involved in terrible cases of abuse, and I want in no way to diminish the seriousness of such crimes. Governments are keen to be seen to be doing something about anything that stirs up strong emotions in the electorate.

It is very easy for David Cameron to call for 'something' to be done, but in merely making this demand, he and others like him demonstrate that they are out of touch and don't really understand what is involved in monitoring web content. One thing we learned from the NSA debacle is that so much data is being collected that it is impossible to do anything with most of it, so you're back to relying on content being reported, or randomly checking websites to see whether they host objectionable content.

There is no easy solution. Politicians calling on companies to police something as inherently unpoliceable as the internet are doing little more than passing the buck. It seems there is no perfect answer.

A policed web is something that should be eyed with suspicion, while something entirely unmanaged runs the risk of becoming anarchic -- although the same could be said of conversations, or letters. Nothing that falls between these two extremes is going to please everyone, but those in power need to be seen to be doing something meaningful rather than using emotive subjects to score points.

Photo Credit: InnervisionArt/Shutterstock

