A select committee has called for more regulation and greater safety on the Internet. But politicians should be careful what they wish for, says Bill Thompson
It would be nice to think that the latest call to ‘do something’ about online content from the Culture, Media and Sport Select Committee was grounded in some new development: a technique that made it trivial for websites to identify adult-oriented content, an online identity system that reliably linked social network profiles with age verification for all users, or a user-friendly but unbreakable watermarking scheme that could identify copyrighted material whenever it appeared on an Internet-accessible computer.
Because the alternative would be that a bunch of MPs has decided that the best way to get some publicity at the start of the summer recess, when newspaper editors are starved of ‘serious’ stories, is to announce that the Internet is like the Wild West, and that children are constantly exposed to unsuitable material on YouTube, reveal intimate personal details on Bebo and surf the web looking for pro-anorexia or pro-suicide sites.
Sadly, it seems that John Whittingdale and his committee members have not been poring over the technical details of IPv6 and OpenID. What we get in their report instead is yet more condemnation of the dark side of today’s Internet, along with a few poorly grounded suggestions as to what might be done, most of which amount to a call for Internet service providers and web hosts to become the net’s new morality police.
There’s a token nod towards the importance of media literacy, but what the committee really wants is self-regulation, so that, for example, user-generated content that might be even slightly contentious would be automatically flagged and reviewed before it appeared. Perhaps YouTube will add news reports of technologically illiterate politicians trying to sanitise the Internet to its list of harmful content; they certainly raise my blood pressure to dangerous levels.
The tactic here is one we have seen many times before. In the past it was usually applied to tabloid newspapers, whose editors would be warned by the responsible minister that they were ‘drinking in the last chance saloon’ and that some form of press regulation was imminent if they failed to ‘clean up their act’.
Newspapers are remarkably easy to intimidate as they tend to be owned by large companies with self-important proprietors who like the access to politicians that accompanies old-style media power. They have assets that can be threatened, employees who can be arrested and means of distribution that can be blocked by governments in extreme situations.
At first sight online seems different, but in fact controlling the net is not as hard as it seems. As Jack Goldsmith and Tim Wu pointed out in their excellent book Who Controls the Internet?, the point is not that the network cannot be regulated, but that it can be difficult for governments to figure out where to apply the pressure. At the moment the ISPs are the main target, as they are easy to find in the phone book and run businesses with premises, equipment and staff, but social network sites are clearly in the frame too.
It is also possible for governments to change the way the network operates. This is a little harder to pull off, but we should not act as if the nature of the Internet is fixed and unchangeable, laid down on tablets of silicon by the network gods back in the old days when the tribe of geeks was wandering in the desert of proprietary protocols.
In his book Code (and in Code 2.0, the crowdsourced second edition) Lawrence Lessig articulated the core principle that we all need to remember: code is law. Change the code and you change the law: the program that lets your computer talk to the Internet was written one way, but it can be written differently.
If we really wanted every email to include a photograph of the person sending it then we could rewrite the standard and change the code, and if we really wanted to link email addresses to entries in the national ID database we could.
The real danger is not that politicians ask for things that the current network architecture cannot support, but that the network could be changed to make those things possible.
YouTube, Flickr, Photobucket and MySpace could, if they chose, change their code to impose the sort of prior moderation of uploaded material we already see on many comment pages and blogs. Bebo could require age verification before letting you have an account, and a verified address before you could post material, so that the authorities could track you down if you did something deemed inappropriate.
There is a dark side to the Internet, and there are real dangers for children who do not understand where they are looking or who is reading what they write. But the real danger of throwing out too many poorly crafted ideas about how to censor, limit and control the network is that some people might decide to take them seriously, and those ideas could severely limit the creative and business potential of the network without actually succeeding in protecting the vulnerable.