On a housing estate somewhere in north-west London, a dispute said to be between rival groups of young men apparently rages on. From this quagmire of social deprivation emerges Chinx (OS) who, released from an eight-year custodial sentence at the four-year mark, starts dropping bars as if his very life depended on it. And, in a way, it does. Because for boys like Chinx, young, black and poor, there is only one way out, and that is to become the next Stormzy. Only, two behemoths stand in his way: the Metropolitan Police and their apparent “side man” Meta, parent company of Facebook and Instagram.
In January 2022, Chinx posted a video clip of a drill music track called Secrets Not Safe. Following a request by the Metropolitan Police, which argued that the post could lead to retaliatory gang-based violence, Meta removed the post and deleted Chinx’s Instagram account.
Meta’s decision has now been reviewed by the Oversight Board, a quasi-independent adjudicator conceived to police the online giant’s application of its own policies but funded by the company.
The Board recently condemned the company’s decision to remove Chinx’s post and delete his account as not complying with Meta’s own stated values and with wider human rights considerations.
As part of its review of Meta’s decision, the Board made a Freedom of Information Act request to the Met over its requests to remove content from various online platforms. Whilst a good proportion of the Met’s responses were unhelpful bordering on the obstructive, what it did disclose was troubling.
In the year to the end of May 2022, the Met asked online platforms, including Meta, to remove 286 pieces of content. Every single one of those requests related to drill music. No other music genre was represented. Some 255 of the Met’s requests resulted in the removal of content, a success rate of nearly 90%.
The decision makes for illuminating, if worrying, reading when one considers the potential chilling impact Meta’s actions may have on the freedom of expression of an already suppressed, marginalised and, some would argue, over-policed section of our community. Four areas of concern emerge.
Law enforcement access to online platforms
Instagram, in common with other applications, has reporting tools available to all users to make complaints. Whilst law enforcement organisations may use such tools, they also have at their disposal what amounts to direct access to these online platforms’ internal complaints procedures. When law enforcement asks for content to be taken down, Meta deals with the request “at escalation”. This triggers investigation of the complaint by Meta’s internal specialist teams, including analysis of the content to determine whether it contains a “veiled threat”.
This case demonstrates, in my view, a worrying pattern: namely, the level of privileged access that law enforcement has to Meta’s internal enforcement teams, as evidenced by correspondence the Board saw in this case.
Lack of evidence
What became clear during the Board’s exposition of the facts was that, despite the apparent need for a causal link between the impugned content and any alleged “veiled threat” or “threat of violence”, law enforcement advanced no evidence in support of its complaint. Given that, as all parties appeared to accept, the content itself was not unlawful, this is shocking.
On the face of it, then, Meta has a system allowing for fast-tracked, direct access to its complaints procedure which may result in the removal of content, without any cogent evidence to support a claim that the content would lead to real-life violence or the threat thereof.
This omission is particularly stark since, in this case, the violence alluded to in the lyrics took place approximately five years before the clip was uploaded. This five-year gap, as the Board commented, made it all the more important for real and cogent evidence to be cited in support of removal of the content. We ought to remind ourselves here that the Board found that in this case there was no evidence of a threat, veiled or otherwise, of real-life violence.
Lack of appeal
Meta’s internal systems dictate that if a complaint is handled “at escalation” – as all government requests to take down content are, including those made by the Met Police – there is no internal right of appeal for the user. Chinx (OS) and the other accounts affected by this decision had no right to appeal the decision with Meta or with the Oversight Board. The result is that a decision that may, in some cases, result in the loss of an income stream as well as an erosion of the right to express oneself freely may go unchallenged by the user. In fact, as Chinx (OS) revealed during an interview with BBC Radio 4’s World at One programme, he was not made aware at any point during the process why his account had been deleted and the content removed.
The Board itself commented that: “The way this relationship works for escalation-only policies, as in this case, brings into question Meta’s ability to independently assess government actors’ conclusions that lack detailed evidence.”
Each of the three shortcomings in Meta’s procedures revealed above by the Board is worrying enough; but, coupled with the disproportionate impact this system has upon black males (the main authors and consumers of this content), it veers dangerously close to systemic racism.
The findings of the Oversight Board’s FOI request on the Met’s activities in relation to online platforms clearly back this up.
The Digital Rights Foundation argues that while some portray drill music as a rallying call for gang violence, it in fact serves as a medium for youth, in particular black and brown youth, to express their discontent with a system that perpetuates discrimination and exclusion.
An insidious and backdoor form of policing
The cumulative effect of Meta’s actions arguably amounts to an insidious and unlegislated form of policing. Without the glare of public scrutiny, with no transparency and no tribunal to test or comment on the lack of evidence, the Met have succeeded in securing punishment through the back door (removal of content could be argued to be a punishment, given that it may lead to loss of income) against content that was not, in and of itself, unlawful.
As the Board pointed out in their decision, for individuals in minority or marginalised groups, the risk of cultural bias against their content is especially acute. Art, the Board noted, is a particularly important and powerful expression of “voice”, especially for people from marginalised groups creating art informed by their experiences. Drill music offers young people, and particularly young black people, a means of creative expression. As the UN Special Rapporteur in the field of cultural rights has stated, “…representations of the real must not be confused with the real… Hence, artists should be able to explore the darker side of humanity, and to represent crimes… without being accused of promoting these.”
The right to express yourself freely, even if what you say may offend sections of our community, is one of those areas that truly tests our commitment to this human right.
For over fifty years, Index on Censorship has supported dissidents, journalists and activists, in part by training them on the most current technology. In recent years that has included how to use encryption and encrypted communication apps, helping them to protect themselves from repressive regimes in the easiest and most comprehensive ways possible. This work was especially necessary when encryption was still a specialist pursuit: it meant intensive training, helping people on the ground to understand their options and to set up often complex peer-to-peer messaging apps.
Now, thankfully, encryption is everywhere; human rights defenders, journalists and MPs use platforms like Signal, Telegram and WhatsApp to exchange everything from gossip to public interest data. Encryption is critical for investigative journalists who need to communicate with sources and to protect their investigations against hostile actors, whether states or criminal gangs.
And for all of us encryption has its uses: sending family photos and sharing personal information. After all, who hasn’t sent their bank details to a friend?
For Index on Censorship, protecting encryption is a critical frontline in the fight for freedom of expression. Free speech isn’t just about the words themselves: it is the freedom to exchange information, the freedom to gather information and the freedom to confide ideas and thoughts to others without the risk of arrest and detention. Encryption is now central to our collective ability to exercise the right to freedom of expression.
Five years ago, Jamie Bartlett wrote for Index on Censorship about how his experience of police intimidation in Croatia, a democratic EU member state, changed his view on encryption. In Jamie’s case it offered a secure means of communicating with a source who the authorities had made it clear they did not want him to speak to.
Today, in too many states, encryption is essential. As we speak, in authoritarian regimes including China, Hong Kong, Belarus and Russia, the difference between using an encrypted messaging app and communicating unencrypted can mean the difference between freedom and imprisonment, if not worse.
Promoting and defending encryption is essential for any organisation that promotes and defends free speech. That’s why Index on Censorship is delighted to announce that we’ve received a grant from WhatsApp, the messaging app, to support our work in defending encryption. The grant of £150,000 will be used for our general work in defending digital freedom, and our work streams will not be determined by anyone other than my team at Index. From our perspective this grant is incredibly welcome: it will allow us to develop new content that explains the importance of encryption to the public, to obtain new legal advice on why encryption should be protected as a fundamental defence of our human rights, and to bring new voices into the debate on why encryption is so critical to free speech.
As with any grant, the grantee has no influence whatsoever over Index on Censorship policy positions or our work itself. Index has had its criticisms of Meta (WhatsApp’s parent company) in the past and I’m sure we will in the future, and we’ll continue to speak freely to any government or company.
Right now, we’re continuing to argue for a pause to the UK government’s rush to push through its flawed Online Safety Bill, so that we have the opportunity to work with ministers to amend the bill, removing its flawed ‘legal but harmful’ provisions (as demolished by Gavin Millar QC’s powerful legal opinion for Index) and ensuring the potential undermining of encryption is taken out of the legislation.
We’ve got a lot to do – but the political weather is changing in the right direction.