
Web 2.0: Don’t shoot the messenger

By Marta Cooper / 24 August 2012


Search engines and social networking sites are at the heart of Web 2.0. To unreasonably threaten them with liability for user content misses the point, says Marta Cooper

All human life can be found on the web. Content found online will range from liberating to offensive. Some will be copper-bottomed truth, some will be rumour, and there will be a fair amount of LOLcats in between.

Social media sites and search engines accelerate and facilitate the sharing of content. Crucially, this content is not created by the host but by the user: ladies and gents, the much heralded Web 2.0.

So it’s problematic when governments and individuals ask intermediaries — internet service providers (ISPs) and content providers — to remove certain content they are hosting. In India, Twitter said this week that it is “co-operating” with the government after the prime minister’s office complained to the website about six accounts that parody PM Manmohan Singh. The ministry of telecommunications has since requested that ISPs block the accounts.

Coincidentally, on Wednesday the US called on India to respect internet freedom, responding to a recent clampdown on social media websites that India blames for adding to tensions between Muslim and northeastern communities in Assam. Under government pressure, Facebook pledged to remove content, block pages or even disable the accounts of users who upload content that incites violence or perpetuates hate speech.

These cases highlight the Indian approach of taking “sensitive” content up directly with internet intermediaries while also blocking sites via their ISPs. Failure to comply could land companies with fines or possible jail time under 2011 guidelines, which expect internet companies to remove content that regulators deem “grossly harmful”, “harassing” or “ethnically objectionable” within 36 hours.

This week it happened to be India caught in the fray, but this is a global situation. The case of Thai webmaster Chiranuch Premchaiporn, who originally faced 20 years in prison on 10 counts of lèse majesté, shows how making intermediaries liable takes us into shoot-the-messenger territory. In May Premchaiporn was convicted by the Bangkok Criminal Court and sentenced to a fine and a suspended eight-month prison term for failing to act quickly enough to remove user comments that were defamatory of the Thai monarchy.

Ex-Formula 1 boss Max Mosley’s claim that the “really dangerous thing are the search engines” entirely misunderstands the function of search engines. They are not publishers. But Mosley, who sued the now-defunct News of the World in 2008 for breach of privacy, told the Leveson Inquiry into press standards last November that he was pursuing litigation in 22 countries and suing Google in France and Germany. He added that he was considering bringing proceedings against the search engine in California in an attempt to remove certain search results.

If a platform is hosting illegal content, or content that does not comply with the platform’s own terms of service, it is to be expected that intermediaries would work with government authorities to remove it. Companies such as Google and Twitter have gone to lengths to be transparent about takedown requests. In January, Twitter adopted a new policy of censoring tweets that could violate local laws (the first government to publicly endorse this was, coincidentally, Thailand, not exactly a stranger to censorship). While there was some backlash, it seems that the microblogging site was simply making the best of a bad situation.

Besides missing the point by not targeting the content’s source, clamping down on intermediaries leads them to err on the side of caution to limit their liability. Ergo, greater self-censorship: in a study published last year, the Bangalore-based Centre for Internet and Society sent “legally flawed” takedown notices to seven intermediaries. Six of them “over-complied” with the notices.

Then there is the impracticality of all this. Over 72 hours of video are uploaded to YouTube every minute. Twitter saw a 182 per cent increase in the number of mobile users from March 2010 to March 2011. In 2010, the average number of tweets people sent per day was 50 million; a year later it was 140 million. Last year, 200 million users were added to Facebook. With these numbers, it is questionable whether these corporations have the manpower — from lawyers to engineers — to regularly sift through all of their content (hat tip to Vladimir Radunovic at DiploFoundation for these stats).

We won’t get to Web 3.0 if intermediaries are forced to pre-screen content so as not to be held liable for it under sweeping terms. The internet would no longer be a rich and innovative space, and governments and individuals would achieve little more than shooting the messenger.

Marta Cooper is an editorial researcher at Index. She tweets at @martaruco


