Debate: The real problems with the Communications Data Bill may not be what you think

Any extension of state powers of surveillance is — rightly — hotly contested. The current Communications Data Bill is no exception. There are problems with this bill — but maybe not the ones you’ve heard of.

Almost universally, its opponents have labelled it the ‘snoopers’ charter’, representing an enormous encroachment of state spying into the lives of innocent citizens. Journalists are outbidding each other in their vitriol towards it, usually calling on Orwell. One example among many is Index’s Mike Harris in the Independent: “This proposed scale of state surveillance will add the UK to the ranks of countries such as Kazakhstan, China and Iran.”

This, to me, is misleading. Yes, China, Iran, and Kazakhstan use “Deep Packet Inspection”, which this Bill proposes. But we also bug citizens’ homes — far more intrusive. What matters is the way it is regulated. There is a difference between governments that pass surveillance laws through a vote of elected representatives of those that will be monitored, and governments that do not.

Nor is it about mass surveillance by the state. This Bill requires — and pays — communications companies to collect and retain data on the existence of people’s communications for 12 months, so that in the event that a request is made for that information, it is available.

Crucially, the state only accesses this information when a successful application is made under the existing Regulation of Investigatory Powers Act 2000 (RIPA). This does not include the content of a communication — which has to go through a more stringent process of access. In that respect, not so much has changed, because this all happens already; it’s just that rather often, the information the police want is not there. (And in case no-one noticed, little brother is already miles ahead of what Big Brother is doing.)

That is not to say that the bill is perfect. Four changes would improve it considerably.

First: clarity. All infringements on our civil liberties need to be based on some kind of public understanding and consent that the measures being taken are proportionate and necessary. But the Bill is vague, the technology complicated, and some specifics necessarily secret. It should be far more explicit: this would allow for at least an informed debate about whether the measures proposed are necessary and proportionate.

Second, given the value of the Internet to the economy and society (something RIPA is pledged to defend); and the potential misuse of modern technology – including the difficulty of splitting content from communication — only the very strictest system of oversight and redress will do here. More is needed.

Third, the root of RIPA is that the more serious the intrusion, the fewer agencies can carry it out, and for fewer purposes. RIPA makes a distinction between content and communications data — the latter being considered far less intrusive, and so much easier to obtain. But when RIPA was passed, communications data was mainly about who you phoned and when. Now it means what websites you visit, where you are, and whom you email. Therefore a new category for this ‘use’ data may need to be created. The authorisation for access should be higher than the current bill proposes, but lower than the Home Secretary signing it off, as with content interception: ideally a warrant from an independent magistrate.

The final problem troubles me most. It is now far easier for the state to access personal information that we citizens happily put into the public domain. Twitter can be mined in real time, open Facebook groups can be monitored, networks and relationships constructed: all outside the RIPA legislation. None of this is mentioned in the new bill — but I think it is this that worries the public and many journalists. As I argued in #intelligence, this type of widespread, mass social media monitoring needs to be regulated, limited, and put on a legal footing. The bill is a chance to tackle this tricky problem: otherwise it could make the current furore seem like a minor skirmish.

Jamie Bartlett is Head of the violence and extremism programme at the UK think-tank, Demos, and Director of the Centre for the Analysis of Social Media. Follow him @JamieBartlett

DEBATE: Index’s Mike Harris on the Comms Data Bill and surveillance

In Britain, the government is proposing legislation (the Communications Data Bill) that will grant the Home Secretary the power to blanket retain data on every citizen for an undefined purpose. It won’t require judicial approval — but potentially every text message, every Facebook message, every phone call, every email from everyone in Britain would be stored on behalf of Her Majesty’s Government. If the Bill passes, companies will have to collect data they don’t currently collect and the Home Secretary will be able to ask manufacturers of communications equipment to install hardware such as ‘black boxes’ on their products to make spying easier. This proposed scale of state surveillance will add the UK to the ranks of countries such as Kazakhstan, China and Iran. This total population monitoring would break the fundamental principle that a judge and court order is required before the state invades the privacy of its citizens by holding their personal data.

Read the full article here

Read Index on Censorship’s position on the Comms Data Bill here


Innovation nominees

Recognising innovation and original use of new technology to circumvent censorship and foster debate, argument or dissent

Freedom Fone by Kubatana, mobile phone technology NGO, Zimbabwe

Kubatana is an NGO based in Harare that uses a variety of new and traditional media to encourage ordinary Zimbabweans to be informed, inspired and active about civic and human rights issues. As an organisation, it continuously seeks innovative fixes to the challenges of sharing independent information in Zimbabwe’s restrictive media environment. Freedom Fone is one of Kubatana’s solutions. A piece of open-source software, Freedom Fone helps organisations create interactive voice response (IVR) menus to enable them to share pre-recorded audio information in any language via mobile phones and landlines with their members or the general public. The software is aimed at organisations or individuals wishing to set up interactive information services for users where the free flow of information may be denied for economic, political, technological or other reasons. Freedom Fone is one of the many ways Kubatana reaches across the digital divide to inform and inspire the vast majority of Zimbabweans who do not have regular or affordable internet access.

ObscuraCam, smartphone app, USA

ObscuraCam is a free smartphone application that uses facial recognition to blur individual faces automatically. Developed by WITNESS and the Guardian Project, it enables users to protect their personal security, privacy and anonymity. In 2011 and 2012, uprisings throughout the Middle East showed the power and danger of mobile video footage. ObscuraCam helps protect activists who fear reprisals but want to safely capture evidence of state brutality. Launched in June 2011 and based in the USA, ObscuraCam is the only facial blurring or masking application that has responded to the concerns of human rights groups, citizen activists and journalists. In addition to obscuring faces, the application removes identifying data such as GPS location data and the phone make and model.

Visualizing.org, data visualisation resource, international

Visualizing.org was created to help make data visualisation more accessible to the general public. It calls itself “a community of creative people making sense of complex issues through data and design… and a shared space and free resource to help you achieve this goal”. Data analysts and graphic designers have set themselves the challenge of sharing a constantly proliferating body of public data in an accessible form. Raw data on its own might as well be censored; visualisation opens up information that would otherwise be left languishing on hard disks or, if downloaded, unintelligible to the average citizen. The project offers a place to showcase work, discover remarkable visualisations and visually explore some of today’s most pressing global issues. Created by GE and Seed Media Group, Visualizing.org promotes information literacy. The portal has had a remarkable year.

Telecomix, internet activists, across Europe

Telecomix is the collective name for a decentralised group of internet activists operating in Europe. Their focus is to expose threats to freedom of speech online. During one operation, Telecomix activists published a huge package of data which proved that the Syrian government was carrying out mass surveillance of thousands of its citizens’ internet usage. Telecomix’s revelation that the technology used was supplied by US firm Blue Coat Systems has prompted serious investigations into the involvement of western technology firms in helping repressive regimes spy on their people. In mid-August 2011, Telecomix’s dispersed group of hackers came together to target Syria’s internet. Those attempting to access the internet through their normal browsers were confronted with a blank page bearing a warning: “This is a deliberate, temporary internet breakdown. Please read carefully and spread the following message. Your internet activity is monitored.” Following this, a page flashed up describing how to take precautions to encrypt usage.

Freedom of Expression Awards 2012


Russia: Googling anything against the authorities is a crime

The fifth year of Russia’s full-scale war against Ukraine grinds on, with its unvarying backdrop of devastated Ukrainian cities and extensive casualties among the non-combatant population. Meanwhile the Russian authorities exploit the war as justification for constantly tightening the screws of their repressive policies at home.

In the last few years, criminal prosecutions for speaking out have become common, everyday occurrences. The definitions of “extremism” have become increasingly vague, and the pressure applied to the independent media and civil society initiatives has become systemic. Alongside these developments, another, less visible, but equally significant process has been gathering momentum: the restructuring of the digital environment in such a way as to induce people to modify their own behaviour – frequently without even realising it.

Six months ago a law came into force in Russia making it a criminal offence to search for extremist materials online. This law, which was widely publicised in the media, functions as a “bogeyman”. That is, the security men’s little lamp won’t light up if you have entered “Navalny” in Google, but if they confiscate your computer and discover a search query like that in it, you can be charged with a crime. In the news, however, they don’t tell you about fine details like that. In the news they simply say that those who search for extremist materials online will be punished, and that is what remains imprinted on people’s minds – that googling anything against the authorities is prohibited.

For a long time, the Russian state’s approach to control of the internet was overt and unsubtle: ban a site, block a platform, restrict access. This didn’t work well. It annoyed people, provoked resistance and rapidly spawned solutions that bypassed restrictions.

But in the fifth year of a war which, in regions under attack by drones, is accompanied by constant interruptions to mobile internet services, a solution has been found. Whitelisting. The implications of the whitelist model are simple: stable access is only assured to services approved in advance by the state. All the rest can operate, but with outages or restrictions, and without any guarantees.

At the same time, Roskomnadzor (the Federal Service for Supervision of Communications, Information Technology and Mass Media) has decided to block calls via WhatsApp and Telegram – and this affects everybody. WhatsApp is the most popular messenger app in Russia, with 96 million users. People, especially the older generation, like it because it is simple. It’s good for everyday and family use, for off-the-cuff calling. Telegram is good for other things: it’s a connection to a field of information, news, politics and alternative points of view. They tried to block it as early as 2018, but when it became clear that direct prohibitions don’t work, the strategy changed. They no longer block apps completely but simply render them inconvenient. And to replace them they offer the “national messenger app” MAX. Celebrities who are loyal to the authorities advertise it on TV and urban billboards. “Great reception even in the car park,” a pro-government female rapper declares as she posts a MAX advertising video in stories, while the other apps beside it can no longer provide any access at all.

MAX is rapidly becoming the compulsory communications channel in schools and nursery schools, universities and colleges, state and municipal institutions, as well as in “house chats” for residents of apartment blocks, facilitated by the management companies. Its introduction is only rarely achieved by means of public command: in most instances it is a case of word-of-mouth instructions and surreptitious pressure – from warnings about “unpleasantness” to threats of disciplinary reprimands or dismissal.

MAX is whitelisted by definition. It is stable in situations where other applications are “temporarily unavailable”. MAX has to be preinstalled on all the mobile devices offered for sale in the country. But MAX is not attempting to become everyone’s “favourite” all at once. It is enough for it to become compulsory. There is no attempt to persuade people – they are simply transferred under the pretext of “convenience”.

MAX’s most crucial characteristic is its profound integration with the platform Gosuslugi (State Services). This is an individual’s digital profile: passport, taxes, fines, medical record, welfare payments. MAX can be used to confirm a person’s identity or age, and it can be used as a digital document – for instance when purchasing alcohol. This changes the very nature of the messenger app. It ceases to be a space for networking and socialising and becomes part of an ID system.

MAX’s very interface suggests that it is the Russian equivalent of the Chinese app WeChat. The Russian authorities are looking to China more and more nowadays – not as a model that can be copied point for point, but as proof that control can be built into everyday reality. The Chinese system doesn’t work by means of incessant prohibitions, but by virtue of people’s habituation to limits. They know in advance what the boundaries are and they act within them. And Russia’s digital policy is gradually leading people in the same direction.

However, WeChat was never designated a “national messenger app”, and people were not herded into it by the threat of being sacked: it defeated the competition on its own terms – thanks to its convenience, ecosystem of services and the early effect of scale. Initially it was simply a messenger app, then a payment instrument, and then a portal to municipal amenities, the media, taxis and state services. The process of habituation was organic, and the infrastructure of control was only constructed around already familiar elements.

MAX was immediately castigated for its aggressive gathering of metadata and wide-ranging requests for permissions – access to contacts, photos, call history, screen – and the absence of end-to-end encryption (E2EE) by default: this means that all messages are saved on servers in readable form, creating the risk of their being accessed by third parties or state agencies.

But it is not the technical details that are most important. The most important thing is the effect: the individual becomes accustomed to the idea that risk, not privacy, is the norm. That it is safer not to discuss anything superfluous. That it is simpler not to ask questions. In this way a new model of social behaviour is taking shape.

Despite the official declarations, MAX has not become massively popular by choice. People use it because they need to. Because otherwise it’s impossible to manage. This is a fundamental difference from messenger apps that have become integrated into life in an organic fashion.

And this is the point at which the most disturbing question of all arises. The war might come to an end, but will the blocking of the mobile internet also end? An infrastructure of social control is rarely temporary. When public money has been invested in it, when it has been built into schools, state institutions and people’s everyday activities, it starts living a life of its own. New justifications for it will always be found: security, stability, new threats. Not coercion, but habituation. When social interaction becomes cautious, there is no longer any need for constant intervention by the censor. Censorship is already built into daily life.

In this sense, what is happening now resembles ever more closely Michel Foucault’s analysis of the Panopticon – an “open prison” in which control is effected, not by means of constant surveillance, but by the possibility of surveillance. Individuals do not need to know that they are being observed at this moment. It is sufficient for them to be uncertain whether they are. In this system the walls become invisible and discipline becomes internal. A digital infrastructure organised around whitelists, identification and unstable means of communication reproduces precisely the same logic: individuals start behaving cautiously, not because they are being punished, but because it’s simply safer that way.

It is also important to note that this behaviour does not remain within the ambit of the application. It is inevitably extrapolated to life offline – to conversations in public spaces, to spontaneous discussion, to the way in which people speak out loud. When communication in digital space becomes cautious and functional, the same model is gradually carried over into ordinary life. The open prison has no need of bars or guards: it inculcates the habit of self-limitation. And that is precisely why such systems remain stable long after the formal reason for their appearance disappears.
