It wasn’t meant to be like this.
Connoisseurs of a good political bust-up may have noticed a subtle change in tempo in the online filtering debate over the Christmas period. The argument, so long owned — in public at least — by the pro-blocking “think of the children” lobby, took a sudden and unexpected twist. For a moment, the villains were not selfish libertarians, determined to place personal freedom of expression above child protection — but the incompetents in government, who had demanded an untested solution without first ensuring it would do more good than harm.
What went wrong?
As German military strategist Helmuth von Moltke, in the news during this World War anniversary year, once put it: “no plan of operations extends with any certainty beyond the first contact with the main hostile force”.
It was always going to be an easy win, banging on about the need to protect children and threatening internet service providers with legislation if they didn’t comply with prime ministerial demands over filtering: easy, too, to dismiss the assorted nerds and geeks who warned it wouldn’t work. As Claire Perry MP, a prime ministerial adviser on this topic, put it: “We should not allow the perfect to drive out the good”.
But since November, filters have arrived with a vengeance and even the technologically naive can see that they don’t exactly work as claimed. A BBC exposé in December revealed what was always expected: They over-blocked some quite useful sites, including sites dealing with LGBTI issues, sex education and even domestic violence and rape, while simultaneously under-blocking a lot of porn.
“Not us, guv,” explained a spokesperson for Number 10. Back in July 2013, David Cameron had very presciently blocked all possible blame by requesting that the UK Council for Child Internet Safety (UKCCIS) make sure this sort of thing didn’t happen. UKCCIS, a body roughly half of whose members have a commercial interest in this area, set up a sub-committee which met in December 2013, some weeks after the first of the new filter solutions hit parental laptops. No minutes, though: Open government has been filtered!
Meanwhile, much wordage was being unleashed in the minority and progressive press. Rather like stories about government losing databases a few years back: No sooner had the press happened on one instance of ridiculous blocking than another, even more ridiculous, case joined the queue.
In the circumstances, to list the legion of legitimate sites that were in one way or another blocked would be tedious. So let’s stick with some of the most serious. BT, it transpired, was offering parents the opportunity to bar access to LGBT material — almost certainly direct discrimination — as well as access to social support.
In other words, if you are a child abuser or perpetrator of domestic violence, just go with BT — and you can shut off one avenue of support for your victims. The filter is still available — though BT have tactfully amended the marketing description.
Two firms — Trend Micro and Dell — were also found to be selling a tool that permitted explicit blocking of LGBTI content: both, following exposure by GayStarNews, have subsequently amended their product.
Opinion has probably not swung irretrievably against blocking and filtering, but it is clear that the public, now aware of what those techniques mean in practice, is suddenly a lot less impressed by political demands for UK providers to censor its net habits.
Along the way came something of an own goal. The latest initiative relates only to filtering and blocking of internet access through PC portals. Mobile phones have been subject to a filtering regime — largely unnoticed — since 2004, while filtering of wifi in public spaces is up for debate in 2014. The bad news for mobile operators was that their activities were suddenly open to the same level of criticism as internet service providers, while the wifi discussion may no longer be quite the slam dunk that government had hoped.
So much for the panic: What about solutions? At a parliamentary meeting last week, sponsored by Julian Huppert, MP and organised by the author of this article, Jane Fae, a wide range of groups came together to discuss the issues raised. That included the usual suspects — the minorities on the sharp end of blocking — as well as representatives from industry and members of other parliamentary parties.
The problems raised above were rehearsed, but the real focus was on the future, and there was little comfort for advocates of filtering. Speakers talked about taking legal action against filter companies, both in respect of discrimination and for compensation when, as happened to one businesswoman, a business website is blocked for no reason other than that its owner is transgender.
The difficulty is that government ministers have continually harped on about the Internet Watch Foundation as a model solution, while blithely ignoring the fact that it also happens to be a Rolls-Royce solution: Sites are individually evaluated by individual moderators. This costs serious money.
However, while this is an issue so important that government has threatened legislation if service providers don’t play ball, government — and the public — seem remarkably unwilling to stump up the many millions that would be required to come close to even a partial fix. So service providers have done what they can, reaching out to solution providers such as Nominum, Symantec and Huawei — all non-UK companies — operating a range of different filtering systems behind the veil of “commercial confidentiality” and not subject to UK law.
There is no single central service to check if a website has been wrongly filtered — even by the government’s own criteria — no central process for removing a potentially ruinous misblock. It’s the cheap option: A bit like the government deciding child protection in the UK was so important, it should be sub-contracted to a bunch of unregulated freelance social work providers.
Is regulation the answer? That was suggested, along with licensing of filter solutions and an independent audit of same. That, however, attracted little support in the meeting, being rejected both by those opposed to all filtering, and by those who felt it would create a costly and bureaucratic quango.
At the same time there was somewhat more appetite for central reporting facilities and a central appeals process. For how is any legitimate business supposed to conduct itself if it needs to keep a constant eye on upwards of 80 different filtering companies?
What of future debate? Ironically, the day of the meeting, Ofcom was also publishing a report that suggested parents were mostly happy with matters as they were. Government, on the other hand, intends to force all net users to decide later this year whether to opt out of filtering. Ofcom also pointed out — as experts already had — that children, the objects of all this protection, were becoming increasingly net savvy, with significant numbers knowing how to evade filtering and cover their browsing tracks.
That is a serious issue. Children in war zones such as Syria or central Africa are likely to have significantly more knowledge of how to use guns than the average British child: It’s all about exposure. Britain, by imposing all these controls, is growing a generation that knows how to evade internet controls. From there, it is but a short step to the darknet, where lurks precisely the sort of criminality that government — again — says it wants to eradicate.
We are likely to hear more about the commercial interests involved in all this. For there is a growing realisation that many of the more startling statistics and internet horror stories are produced and disseminated by companies offering filtering solutions and American evangelist organisations: Sometimes one and the same.
It is to be hoped, too, that the media and politicians will be more critical of some of the wilder statistics being tossed around in debate. Take, for instance, the incidence of children viewing porn on the internet. “The average child sees their first porn by the age of just 11. Between 60 and 90 per cent of under-16s have viewed hardcore online pornography” — according to a survey carried out in 2010 by Psychologies magazine, based on the views (no numbers cited) of 14 to 16-year-olds at a north London secondary school.
As opposed to the EU Kids Online survey of over 25,000 children in 25 countries that found just 11% of UK children had viewed any form of porn online in the previous 12 months.
Which would you believe? Which do you expect to be cited approvingly — and frequently — in the tabloid press?
So where are we now? Battle has at last been joined, and finally the public can see that there are major practical problems associated with online filtering. That hasn’t, yet, diminished the appetite of the Conservative party for more of the same. Nor has it dissuaded the Labour party from jumping aboard the same bandwagon.
Meanwhile, in an act that smacks of the politics of masochism, Labour appears to have pledged that if voluntary filtering fails, then, if elected in 2015, it will legislate to introduce mandatory filters.
This one, it seems, will run and run.
This article was published on 24 January 2014 at indexoncensorship.org
The Prime Minister’s touching belief that he can clean up the web with technology is misguided and even dangerous, says Padraig Reidy
Announcing plans to clean up the internet on Monday morning, David Cameron invoked King Canute, saying he had been warned “You can as easily legislate what happens on the Internet as you can legislate the tides.”
The story of Canute and the sea is that the king wanted to demonstrate his own fallibility to fawning fans. But David Cameron today seems to want to tell the world that he can actually eliminate everything that’s bad from the web. Hence we had “rape porn”, child abuse images, extreme pornography and the issue of what children view online all lumped together in one speech. All will be solved, and soon, through the miracle of technology.
Cameron points out that “the Internet is not a sideline to ‘real life’ or an escape from ‘real life’; it is real life.” In this much he’s right. But he then goes on to discuss the challenge of child abuse and rape images in almost entirely technological terms.
I’ve written before about the cyber-utopianism inherent in the arguments of many who are pro filtering and blocking: there is an absolute faith in the ability of technology to tackle deep moral and ethical issues; witness Cameron’s imploring today, telling ISPs to “set their greatest minds” to creating perfect filters. Not philosophers, mind, but programmers.
Thus, as with so many discussions on the web, the idea that if something is technologically possible, then there is no reason not to do it, prevails. It’s simply a matter of writing the right code rather than thinking about the real implications of what one is doing. This was the same thinking that led to Cameron’s suggestion of curbs on social media during the riots of 2011.
The Prime Minister announced that, among other things, internet service providers will be forced to provide default filters blocking sites. This is a problem both on a theoretical and practical level; theoretically as it sets up a censored web as a standard, and practically because filters are imperfect, and block much more than they are intended to. Meanwhile, tech-savvy teenagers may well be able to circumvent them, meaning parents are left with a false sense of security.
The element of choice here is key; parents should actively choose a filter, knowing what that entails, rather than passively accepting one, as currently proposed by the Prime Minister. Engaging with that initial thought about what is viewed in your house could lead to greater engagement and discussion about children’s web use – which is the best way to protect them.
It is proposed that a blacklist of search terms be created. As Open Rights Group points out, it will simply mean new terms will be thought up, resulting in an endless cat and mouse game, and also a threat of legitimate content being blocked. What about, say, academic studies into porn? Or violence against women? Or, say, essays on Nabokov’s Lolita?
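The cat-and-mouse dynamic and the overblocking risk can be illustrated with a deliberately naive keyword blacklist. This is a purely illustrative sketch — the terms and queries below are invented examples, not any real provider’s list or system:

```python
# Illustrative sketch of a naive substring blacklist, of the kind critics warn about.
# The terms here are invented for demonstration; real filters are more elaborate,
# but the underlying failure modes are the same.
BLACKLIST = {"porn", "sex"}

def is_blocked(query: str) -> bool:
    """Block any query containing a blacklisted term as a substring."""
    q = query.lower()
    return any(term in q for term in BLACKLIST)

# Legitimate queries are caught along with the intended targets...
print(is_blocked("academic studies into pornography"))  # True - overblocked
print(is_blocked("sex education resources"))            # True - overblocked
# ...while a trivial misspelling slips straight through.
print(is_blocked("p0rn"))                               # False - evaded
```

The sketch shows both halves of the Open Rights Group’s point at once: substring matching blocks scholarship and sex education, yet is defeated by the first new coinage anyone thinks up.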
Again, there is far too much faith in the algorithm, and far too little thinking about the core issue: tracking down and prosecuting the creators of abuse images. The one solid proposal on this front is the creation of a central secure database of illegal images from which police can work, though the prime minister’s suggestion that it will “enable the industry to use the digital hash tags from the database” does not fill one with confidence that he is entirely across this issue.
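What the database proposal presumably means by a “digital hash” (not, as the prime minister had it, a “hash tag”) is a fingerprint computed from a file and compared against fingerprints of known images. A minimal sketch, using an ordinary cryptographic hash and invented placeholder data — production systems rely on perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding, which a plain cryptographic hash does not:

```python
import hashlib

# Hypothetical database of fingerprints of known images, stored as hex digests.
# The byte string here is a placeholder, not real data.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 fingerprint is in the known-image set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_database(b"example-known-image-bytes"))   # True - exact copy detected
print(matches_database(b"example-known-image-bytes "))  # False - one byte changed
```

Note the limitation the second call exposes: changing a single byte defeats an exact hash entirely, which is why matching known material is police infrastructure work rather than a filter one can simply switch on.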
The vast majority of trade in abuse images comes on darknets and through criminal networks, not through simple browser searches. This is fairly easily proved when one, to use the Prime Minister’s example, searches for “child sex” on Google. Unsurprisingly, one is not immediately bombarded with page after page of illegal child abuse images.
As Daily Telegraph tech blogger Mic Wright writes: “The unpleasant fact is that the majority of child sexual abuse online is perpetrated beyond even the all-seeing eye of Google.”
The impulses to get rid of images of abuse, and shield children from pornography, are not bad ones. But to imagine that this can be done solely by algorithms creating filters, blacklists and blocking, rather than solid support for police work on abuse images, and proper, engaged debate on the moral and ethical issues of what we and our children can and cannot view online, really is like imagining one can command the tides.