David Cameron’s King Canute moment
The Prime Minister's touching belief that he can clean up the web with technology is misguided and even dangerous, says Padraig Reidy
22 Jul 13

Announcing plans to clean up the internet on Monday morning, David Cameron invoked King Canute, saying he had been warned “You can as easily legislate what happens on the Internet as you can legislate the tides.”

The story of Canute and the sea is that the king wanted to demonstrate his own fallibility to fawning fans. But David Cameron today seems to want to tell the world that he can actually eliminate everything that’s bad from the web. Hence we had “rape porn”, child abuse images, extreme pornography and the issue of what children view online all lumped together in one speech. All will be solved, and soon, through the miracle of technology.

Cameron points out that “the Internet is not a sideline to ‘real life’ or an escape from ‘real life’; it is real life.” In this much he’s right. But he then goes on to discuss the challenge of child abuse and rape images in almost entirely technological terms.

I’ve written before about the cyber-utopianism inherent in the arguments of many who are pro-filtering and blocking: there is an absolute faith in the ability of technology to tackle deep moral and ethical issues. Witness Cameron today imploring ISPs to “set their greatest minds” to creating perfect filters. Not philosophers, mind, but programmers.

Thus, as with so many discussions of the web, the idea prevails that if something is technologically possible, there is no reason not to do it. It becomes simply a matter of writing the right code rather than thinking about the real implications of what one is doing. This was the same thinking that led to Cameron’s suggestion of curbs on social media during the riots of 2011.

The Prime Minister announced that, among other things, internet service providers will be forced to provide default filters that block sites. This is a problem on both a theoretical and a practical level: theoretically because it establishes a censored web as the standard, and practically because filters are imperfect and block much more than they are intended to. Meanwhile, tech-savvy teenagers may well be able to circumvent them, leaving parents with a false sense of security.

The element of choice here is key: parents should actively choose a filter, knowing what that entails, rather than passively accept one, as the Prime Minister currently proposes. Engaging with that initial question of what may be viewed in your house could lead to greater engagement and discussion about children’s web use, which is the best way to protect them.

It is also proposed that a blacklist of search terms be created. As the Open Rights Group points out, this will simply mean that new terms are thought up, resulting in an endless cat-and-mouse game, and it threatens legitimate content too. What about, say, academic studies of porn? Or violence against women? Or essays on Nabokov’s Lolita?

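To see why, consider a crude sketch of such a blacklist (the terms, queries and code here are invented for illustration, not taken from any actual proposal). A naive filter fails in both directions at once: it blocks legitimate queries and misses anything not yet on its list.

```python
# Illustration only: a naive search-term blacklist of the kind proposed.
# All terms and queries below are invented for the example.
BLACKLIST = {"child sex", "rape porn"}

def is_blocked(query: str) -> bool:
    """Return True if any blacklisted term appears anywhere in the query."""
    q = query.lower()
    return any(term in q for term in BLACKLIST)

for query in [
    "child sexual abuse research paper",  # legitimate research, blocked:
                                          # "child sex" matches as a substring
    "academic study of rape porn",        # legitimate study, blocked
    "essays on Nabokov's Lolita",         # allowed, but so is...
    "some newly coined euphemism",        # ...any term not yet on the list
]:
    status = "BLOCKED" if is_blocked(query) else "allowed"
    print(status, "-", query)
```

The last case is what makes the cat-and-mouse game endless: the list can only ever chase the language, never lead it.
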
Again, there is far too much faith in the algorithm, and far too little thinking about the core issue: tracking down and prosecuting the creators of abuse images. The one solid proposal on this front is the creation of a central secure database of illegal images from which police can work, though the Prime Minister’s suggestion that it will “enable the industry to use the digital hash tags from the database” does not fill one with confidence that he is entirely across the issue.

The vast majority of the trade in abuse images happens on darknets and through criminal networks, not through simple browser searches. This is fairly easily demonstrated when one, to use the Prime Minister’s example, searches for “child sex” on Google. Unsurprisingly, one is not immediately bombarded with page after page of illegal child abuse images.
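On the database itself, “digital hash tags” presumably means hashes: digital fingerprints of known images that services can match files against automatically. Below is a minimal sketch of that matching, with placeholder values; real systems such as Microsoft’s PhotoDNA use perceptual hashes, which survive resizing and re-encoding, whereas the plain SHA-256 shown here matches only byte-identical files.

```python
import hashlib

# Hypothetical excerpt of the shared database: hex digests of known
# illegal images (placeholder values, not real digests).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_database(path: str) -> bool:
    """True if this exact file is already in the shared database."""
    return sha256_of_file(path) in KNOWN_HASHES
```

Even done well, such matching only flags images police already know about; identifying new abuse still comes down to the investigative work the speech largely skipped over.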

As Daily Telegraph tech blogger Mic Wright writes: “The unpleasant fact is that the majority of child sexual abuse online is perpetrated beyond even the all-seeing eye of Google.”

The impulses to get rid of images of abuse, and to shield children from pornography, are not bad ones. But to imagine that this can be done solely by algorithms, through filters, blacklists and blocking, rather than through solid support for police work on abuse images and a proper, engaged debate on the moral and ethical questions of what we and our children can and cannot view online, really is like imagining one can command the tides.

By Padraig Reidy

Padraig Reidy is the editor of Little Atoms and a columnist for Index on Censorship. He has also written for The Observer, The Guardian, and The Irish Times.
