An imprecise and unwelcome art
Internet filtering, no matter how modern, serves the same purposes as censorship always has, says Egbert Dommering
28 Oct 08


Filtering is the latest form of censorship. By filtering we mean the technical blocking of the free flow of information across the Internet that states put in place, or require or persuade private institutions to put in place.

Filtering can be implemented at different technical levels of the Internet. The most common practices are IP blocking, so-called DNS tampering and proxy blocking. IP blocking is based on blacklists of IP addresses (the numerical equivalents of domain names) and can be applied in two ways. Routers can be instructed to drop all packets that carry a blacklisted IP address in the ‘header’ of the message.

As the host addressed by an IP address typically provides several services (hosting web sites, email servers), all services on the blocked IP address become unavailable. Moreover, one IP address may be in use by several domain names; all those domain names will be blocked.
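The drop rule and its collateral effect can be sketched in a few lines of Python. All addresses and domain names below are hypothetical examples drawn from the reserved documentation ranges, not any real blacklist:

```python
# Sketch of router-level IP blacklisting (hypothetical addresses only).
BLACKLIST = {"203.0.113.7"}  # a blacklisted IP (reserved documentation range)

# Several domain names may share one host IP; blocking the IP blocks them all.
DNS_TABLE = {
    "banned-site.example": "203.0.113.7",
    "innocent-blog.example": "203.0.113.7",  # collateral damage: same host
    "other-site.example": "198.51.100.4",
}

def route_packet(dest_ip: str) -> str:
    """Return 'FORWARD' or 'DROP', as a blacklisting router would."""
    return "DROP" if dest_ip in BLACKLIST else "FORWARD"

for domain, ip in DNS_TABLE.items():
    print(domain, "->", route_packet(ip))
```

Note that the router sees only the IP address, not the domain name, which is why the innocent site sharing the host is dropped along with the banned one.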

The second form of IP filtering is inspection of the content of the message for banned words. Routers only read the headers and disregard content, so extra equipment is needed. The complication with the packet-switching design of the IP protocol is that a message is split up into separate packets; inspection then becomes complicated, and either misses the banned part or damages the whole message.
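Why packet splitting frustrates content inspection can be shown with a small sketch, assuming a hypothetical banned word and a naive per-packet scanner:

```python
# Sketch: per-packet keyword inspection versus a message split across packets.
BANNED = "forbidden"  # hypothetical banned word

def naive_inspect(packets):
    """Scan each packet in isolation, as add-on inspection equipment might."""
    return any(BANNED in p for p in packets)

message = "this text contains the forbidden word"
# The IP layer may split the message so the banned word straddles two packets:
packets = [message[:25], message[25:]]  # "...the fo" / "rbidden word"

print(naive_inspect(packets))        # per-packet scan misses the split word
print(BANNED in "".join(packets))    # only reassembling the message catches it
```

Reassembling every message before scanning would close the gap, but at the cost of holding and delaying traffic, which is exactly the complication the article describes.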

DNS tampering targets the Domain Name Servers, the databases that link domain names to IP addresses. Feeding a Domain Name Server a list of banned names has the effect that the DNS system no longer provides an IP address for the corresponding domain name. The request for a website perishes in nowhere land.
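A tampered resolver can be sketched as a lookup table with a banned-name list in front of it; the names and addresses below are hypothetical:

```python
# Sketch of a tampered DNS server (hypothetical names and addresses).
BANNED_NAMES = {"banned-site.example"}

ZONE = {
    "banned-site.example": "203.0.113.7",
    "news-site.example": "198.51.100.4",
}

def resolve(name: str):
    """Return the IP for a name, or None if the name is on the banned list."""
    if name in BANNED_NAMES:
        return None  # no answer: the request perishes in nowhere land
    return ZONE.get(name)

print(resolve("news-site.example"))
print(resolve("banned-site.example"))
```

Because the blocked site's IP address still exists, this form of filtering is shallow: a user who knows the address, or who can reach an untampered resolver, gets through.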

A proxy server is an intermediary that caches (temporarily stores) web pages that are often requested by users, so that serving a requested page becomes more efficient. The proxy needs the consent of the provider of the page, and here blocking instructions can hit banned web pages: the proxy then will not give access to the banned page. Proxies that are known to host many forbidden pages can be eliminated altogether by redirecting all traffic aimed at them to ‘clean’ proxies.
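Proxy-level blocking can be sketched as a cache lookup guarded by a banned-URL list; the URLs below are hypothetical:

```python
# Sketch of a caching proxy that refuses blacklisted pages (hypothetical URLs).
BANNED_URLS = {"http://banned.example/page"}
CACHE = {"http://news.example/front": "<html>front page</html>"}

def proxy_fetch(url: str) -> str:
    """Serve from cache when possible; refuse any URL on the banned list."""
    if url in BANNED_URLS:
        return "403 Forbidden"  # the blocking instruction hits here
    if url in CACHE:
        return CACHE[url]  # the efficient case: served from the cache
    return "502 (fetching from the origin server is omitted in this sketch)"

print(proxy_fetch("http://news.example/front"))
print(proxy_fetch("http://banned.example/page"))
```

Unlike IP blocking, the proxy sees the full URL, so it can block a single page rather than a whole host, which is precisely why authorities find this layer attractive.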

Besides filtering measures, there are many other ways to make life difficult for ISPs and information providers on the Internet, but I leave it at that.

Why filtering? Filtering serves the same purposes as censorship has always served. Roughly speaking these are social, political and security goals. By ‘social’ we mean the forbidden fruits of sex and gambling, and the sanctity of religion. By ‘political’: repression of dissident opinions and protection of the interests of the ruling political authority.

Since we have entered the era of terrorism, security is moving fast up the agenda of suppression. The world filtering map shows the top ‘filtering countries’. Not all countries practice the same degree of filtering; some restrict themselves to one or two areas. Countries like China, Saudi Arabia, Iran and Pakistan are at the centre: they control the flow of information on the Internet from all three angles.

China probably gets the gold medal for the depth and scope of its filtering practices. The organisation of the electronic flow of information in China facilitates this. On the technical level it can route the traffic through a few central servers, controlled by the government. Access is channelled through Internet service providers owned or controlled by the state. Internet cafés are closely monitored by means of a strict licensing system.

If we look at what I call the circles of censorship, we see a trend moving from the centre, the editing of information, to the outer circle of users. In the circles between those two extremes we find organisations that are more and more monitored. In the chain of distribution, we encounter booksellers and telecommunications companies, traditionally the object of surveillance practices.

In the trend towards the user, the new companies offering access to information (traditionally: libraries) are Internet service providers and search engines. These are more and more the targets for implementing filtering methods. But the ultimate target is the user, who leaves his or her electronic imprint on the Internet wherever he or she goes, most of all in the electronic records of telecommunications companies and access providers.

The freedom of the editor’s centre was won in Europe in the 18th and 19th centuries, with the abolition of censorship. It created a mentality at the top of editing companies to fight, for principled reasons, any threatened interference by the authorities. Distribution companies, and especially telecommunications companies, developed a different frame of mind: they did not wish to be bothered with questions about the content of the information they transport, which they consider to be secret.

They are prepared to fight for the protection of that secrecy, but not for the protection of the content, and not at any price. The new victims of censorship, the ISPs and search engines, find themselves in a hybrid position between the two, with an inclination towards the telecom operator’s frame of mind.

The more commercial among them are therefore prepared to give in to the pressure of authorities to implement filtering techniques, in order to conduct their access and transport business without too many problems. Their approach is pragmatic rather than principled.

I see four main reasons why we should oppose filtering as much as we should:

• Lack of proportionality
• Lack of transparency
• Lack of due process
• Technological ‘greed’

Filtering is often both ‘overbroad’ and ‘underbroad’. By this we mean that filtering is not very precise: it hits too much content, or not all the content we wish to restrict. One of the basic principles of the restriction of human rights (and freedom of expression is a human right) is that a restriction, if it is legitimate, ought to be proportionate: effective and not unnecessarily heavy. Filtering methods most of the time do not meet this test.

‘Blacklisting’ and ‘black-labelling’ lack transparency as much as the secret Vatican Index of old. Nobody knows what is on the lists or why it is there. The second lack of transparency lies in the emerging forms of cooperation between states and the private companies that provide access to information and deliver content to the user. These forms of control through technology and so-called ‘self-regulation’ create a myriad of rules outside the framework of the state, which used to be governed by the rule of law.

The more national states and international organisations lose control over the flows of information, the more they favour self-regulation deals with companies under their jurisdiction, in order to protect their fading territories.

The so-called ‘notice and take down’ procedures that aim to eliminate illegal and undesirable content from the Internet are carried out by the authorities through persuasion and the willing cooperation of the companies concerned, outside any judicial control.

Fast-developing communications technology increases by the day the ability to monitor individual communications behaviour. This makes state authorities greedy to control anything technology enables them to control. There is a growing practice, in fact and in law, not only of putting increasing burdens on companies to retain private records for a long time, but also of widening the scope of the legal instruments for gaining access to those records.

The paradox of our times is that a society which considers personal freedom as one of its highest values more and more becomes a society of control.

Egbert Dommering is professor of Information Law at the University of Amsterdam. He was speaking at a conference on Neo-Censorship to mark the Amsterdam World Book Capital / International Publishers Association 2008 IPA Freedom to Publish Prize in September 2008. His comments draw on the report Access Denied, published by MIT Press, Cambridge, Massachusetts in 2008, edited by Ronald Deibert, John Palfrey, Rafal Rohozinski and Jonathan Zittrain.