{"id":27717,"date":"2011-10-11T11:13:12","date_gmt":"2011-10-11T10:13:12","guid":{"rendered":"http:\/\/www.indexoncensorship.org\/?p=27717"},"modified":"2017-07-21T17:19:16","modified_gmt":"2017-07-21T16:19:16","slug":"web-filtering-keeping-it-clean","status":"publish","type":"post","link":"https:\/\/www.indexoncensorship.org\/newsite02may\/?p=27717","title":{"rendered":"Web filtering: Keeping it clean?"},"content":{"rendered":"<p><a href=\"http:\/\/www.indexoncensorship.org\/newsite02may\/wp-content\/uploads\/2011\/10\/block-porn140140.gif\"><img decoding=\"async\" loading=\"lazy\" class=\"alignright size-full wp-image-27929\" title=\"block-porn140140\" src=\"http:\/\/www.indexoncensorship.org\/newsite02may\/wp-content\/uploads\/2011\/10\/block-porn140140.gif\" alt=\"\" width=\"140\" height=\"140\" \/><\/a><em>David Cameron has announced plans to block access to pornography online, with providers offering the choice to turn on a filter.<\/em><\/p>\n<p><em> In a 2009 edition of Index on Censorship magazine Seth Finkelstein examines how indiscriminate blocking systems can be a source of censorship<\/em><\/p>\n<p><!--more--><\/p>\n<p><strong>Obscenity online is posing some of the greatest challenges to free speech advocates and censors. Seth Finkelstein explains why<\/strong><\/p>\n<p>When people talk of a topic such as obscenity, they almost always treat it as an intrinsic property, as if something either is, or isn\u2019t, obscene (just as a woman is or isn\u2019t pregnant). But in fact, in the US, definitions vary from state to state \u2013 enshrined in law as \u201ccommunity standards\u201d &#8212; which means that obscenity is a geographic variable, not a constant. Something cannot be legally adjudicated obscene for all the world, but only within a particular community. 
And standards can vary widely between, say, cities such as New York or San Francisco and cities such as Cincinnati or Memphis.<\/p>\n<p>This has profound implications for obscenity on the Internet and for censorship.<\/p>\n<p>In the case of <a href=\"http:\/\/scholar.google.co.uk\/scholar_case?case=7957202946917009561&amp;hl=en&amp;as_sdt=2&amp;as_vis=1&amp;oi=scholarr\">Nitke v. Ashcroft<\/a>, in which I served as an expert witness, a court tried to grapple with these difficulties and found them daunting. In 2001, Barbara Nitke, an American photographer known for her erotic portraits of the BDSM [bondage, domination, sadomasochism] community, filed a lawsuit challenging the constitutionality of the Communications Decency Act &#8212; a federal statute that prohibits obscenity online.<\/p>\n<p>Nitke argued that the Internet does not allow speech to be restricted by location. Yet anyone posting explicit material risks prosecution according to the standards of the most censorious state in the country. Nitke claimed that this violated her First Amendment rights. She lost the case in 2005: the court ruled that she had presented insufficient evidence to convince the judges of her argument.<\/p>\n<p>The question of definitions is also fundamental to government attempts to censor obscene material online. 
The most popular method of attempting to regulate obscenity is the secret blacklist, in the shape of \u201ccensorware\u201d, often relying heavily on purely algorithmic determinations.<\/p>\n<p>Censorware is software that is designed and optimised for use by an authority to prevent another person from sending or receiving information.<\/p>\n<p>This fundamental issue of control is the basis for a profound free speech struggle that will help define the future shape of worldwide communications, as battles over censorship and the Internet continue to be fought.<\/p>\n<p>The most common censorware programs consist of huge secret blacklists of prohibited sites, and a relatively simple engine that compares the sites a user attempts to view against those blacklists. There are some more exotic systems, but they have many flaws and are beyond the scope of this article (though I\u2019ll note that software that claims to detect \u201cflesh tones\u201d typically has a very restrictive view of humanity). While blacklists related to sexual material garner the lion\u2019s share of attention, it\u2019s possible to have dozens of different blacklists. For example, \u201chate speech\u201d is another contentious category.<\/p>\n<p>Note that I do not use the word \u201cfilter\u201d. I believe that once you concede the rhetorical framework of \u201cfilters\u201d and \u201cfiltering\u201d, you have already lost. This is not a matter of mere partisan politics. Rather, there\u2019s an important difference in how the words used may channel thought about the issue. To talk of a \u201cfilter\u201d conjures up a mental image of ugly, horrible, toxic material that is being removed, leaving a clean and purified result &#8212; eg a coffee filter or a dirt filter. One just wants to throw the ugly stuff away. 
Now consider if we have wide-ranging disagreements on the differences between what is socially valuable erotica, tawdry but legal pornography, and illegal obscenity \u2013 how could a computer program ever make such artistic distinctions?<\/p>\n<p>Crucially, censorware blacklists do not ordinarily encompass legal matters such as community standards or even the criterion \u201csocially redeeming value\u201d. They do not take into account geographic variation at the level of legal jurisdictions. There is no due process, no advocacy for the accused, no adversary system. Everything is done in secret. Indeed, examining the lists themselves is near impossible, since they are frequently hidden and\/or encrypted.<\/p>\n<p>In 1995, I was the first person to decrypt these blacklists, and found they were not just targeting commercial sex sites, but some had also blacklisted feminism, gay rights, sex education and so on. While this was a revelation in general, especially to various interested parties who were touting censorware as a saviour in complicated politics, I was not personally surprised. I viewed it as an inevitable historical outcome when censor-minded people are given free rein to act without accountability.<\/p>\n<p>I was eventually forced to abandon the decryption research due to lack of support and rising legal risk. But the lessons of what I, and later others, discovered have a direct bearing on current debates surrounding national censorware systems. Although such systems are ordinarily promoted as covering only obscenity and other illegal material, without checks and balances there can be no assurance against mission creep or political abuse. 
When there\u2019s no ability to examine decisions, the relatively narrow concept of formal obscenity can become an expansive justification for wide-ranging suppression.<\/p>\n<p>Blocking material that is considered obscene also has wider repercussions for free speech \u2013 and its regulation. If a control system can actually prevent teenagers in the West from getting sex-related content, it will also work against citizens in China who want to read political dissent. And inversely, if political dissenters can escape the constraints of dictatorial regimes, teenagers will be able to flout societal and parental prohibitions. It\u2019s worth observing that those who argue that the Chinese freedom-fighters are morally right, while teenagers interested in sex are morally wrong, are not addressing the central architectural questions.<\/p>\n<p>There is, in fact, much more of an interest in sex than in rebelling against dictatorships. So it\u2019s conceivable that there could be a worst-of-both-worlds result, where authoritarian governments could have some success in restricting their citizens\u2019 access to information, but attempts to exclude masses of sexual information are ultimately futile.<\/p>\n<p>Furthermore, there is an entire class of websites dedicated to preserving the privacy and anonymity of reading, by encrypting and obscuring communications. These serve a variety of interests, from readers who want to leave as little trail as possible of their sexual interests, to dissidents not wanting to be observed seeking out unofficial news sources. The many attempts by dictatorial regimes to censor their populations have spurred much interest in censorware circumvention systems, especially among technically minded activists interested in aiding democratic reformers. 
Sometimes, to the chagrin of those working for high-minded political goals, the major interest of users of these systems is pornography, not politics. But that only underscores how the social issues are distinct from the technical problem.<\/p>\n<p>Recall again the importance of how the debate is framed. If the question is \u201cResolved: Purifiers should be used to remove bad material\u201d, then civil libertarians are already at a profound disadvantage. But if instead the argument is more along the lines of \u201cResolved: Privacy, anonymity, even language translation sites, must not be allowed, due to their potential usage to escape prohibitions on forbidden materials\u201d, then that might be much more favourable territory for a free speech advocate to make a case.<\/p>\n<p>In examining this problem, it\u2019s important not to get overly bogged down in a philosophical dispute between what I call the \u201ccontrol rights\u201d theory and the \u201ctoxic material\u201d theory. Many policy analysts concern themselves with working out who has the right to control whom, and in what context. The focus is on the relationship between the would-be authority and the subject. In contrast, a certain strain of moralist considers forbidden fruit akin to a poisonous substance, consumption of which will deprave and corrupt. It is the information itself that is considered harmful.<\/p>\n<p>Adherents of these two competing theories often talk past one another. Worse, \u201ccontrol rights\u201d followers sometimes tell \u201ctoxic material\u201d believers that the latter should be satisfied with solutions the former deem proper (e.g. 
using censorware only in the home), which fails to grasp the reasoning behind the divide between the two approaches.<a href=\"http:\/\/www.indexoncensorship.org\/newsite02may\/wp-content\/uploads\/2009\/04\/index-on-censorship-obscenity.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"alignright size-full wp-image-1912\" title=\"index-on-censorship-obscenity\" src=\"http:\/\/www.indexoncensorship.org\/newsite02may\/wp-content\/uploads\/2009\/04\/index-on-censorship-obscenity.jpg\" alt=\"\" width=\"100\" height=\"147\" \/><\/a><\/p>\n<p><strong>This article originally appeared in Index on Censorship magazine&#8217;s issue on obscenity, &#8220;I Know it When I See it&#8221; (Volume 38, Issue 1, 2009). <a href=\"http:\/\/www.indexoncensorship.org\/newsite02may\/subscribe\/\">Click here to subscribe to Index on Censorship<\/a><\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>David Cameron has announced plans to block access to pornography online, with providers offering the choice to turn on a filter.<br \/> <strong>Seth Finkelstein<\/strong> examines how indiscriminate blocking systems censor not just pornography, but feminist, gay rights and education 
material<\/p>\n","protected":false},"author":14,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","_mi_skip_tracking":false},"categories":[4883,8890,581,21],"tags":[687,3539,106,3900,2469,3888],"_links":{"self":[{"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=\/wp\/v2\/posts\/27717"}],"collection":[{"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=\/wp\/v2\/users\/14"}],"replies":[{"embeddable":true,"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=27717"}],"version-history":[{"count":25,"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=\/wp\/v2\/posts\/27717\/revisions"}],"predecessor-version":[{"id":27835,"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=\/wp\/v2\/posts\/27717\/revisions\/27835"}],"wp:attachment":[{"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=27717"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=27717"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.indexoncensorship.org\/newsite02may\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=27717"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}