There has been significant commentary on the flaws of the Online Safety Bill, particularly the harmful impact on freedom of expression of the ‘duty of care’ owed to adult internet users and the problematic ‘legal but harmful’ category of online speech. Index on Censorship has identified another, far less examined, area of the Bill that now deserves our attention. The provisions in the Online Safety Bill that would enable state-backed surveillance of private communications contain some of the broadest and most powerful surveillance powers ever proposed in a Western democracy. It is our opinion that the powers conceived in the Bill would not be lawful under our common law and existing human rights legal framework.
Index on Censorship has commissioned a legal opinion from Matthew Ryder KC, an expert on information law, crime and human rights, and Aidan Wills, a barrister at Matrix Chambers. This report (a) summarises the main legal arguments and analysis; (b) provides a more detailed explanation of the powers contained in Section 104 notices; and (c) lays out the legal opinion in full.
The legal opinion shows how the powers conceived go beyond even the controversial powers contained within the Investigatory Powers Act 2016 – but, critically, without the safeguards that Parliament inserted into that Act to ensure it protected the privacy and fundamental rights of UK citizens. The powers in the Online Safety Bill have no such safeguards as yet.
The Bill as currently drafted gives Ofcom the power to impose Section 104 notices on the operators of private messaging apps and other online services. These notices allow Ofcom to require specific technologies (e.g. algorithmic content detection) that provide for the surveillance of the private correspondence of UK citizens, and to impose them with limited legal safeguards. It means the UK would be one of the first democracies to place a de facto ban on end-to-end encryption for private messaging apps. No communications in the UK – whether between MPs, between whistleblowers and journalists, or between a victim and a victim support charity – would be secure or private. In an era in which Russia and China continue to work to undermine UK cybersecurity, we believe this could pose a critical threat to UK national security.
The King’s Counsel’s legal opinion finds that:
● Section 104 notices amount to state-mandated surveillance because they grant the power to impose technologies that would intercept and scan private communications on a mass scale. The principle that the state can mandate the surveillance of millions of lawful users of private messaging apps should require a much higher threshold of legal justification, which has not been established to date. Currently this level of state surveillance would only be possible under the Investigatory Powers Act where there is a threat to national security.
● Ofcom will have broader powers of mass surveillance over UK citizens than the UK’s spy agencies, such as GCHQ, have under the Investigatory Powers Act 2016. Ofcom could impose surveillance on all private messaging users with a notice, underpinned by significant financial penalties, with less legal process and fewer protections than GCHQ would need for a far more limited power.
● Questionable legality: The proposed interferences with the rights of UK citizens arising from surveillance under the Bill are unlikely to be in accordance with the law and are open to legal challenge.
● Failure to protect journalists: if the Bill is enacted, journalists will not be properly protected from state surveillance, risking source confidentiality and endangering human rights defenders and vulnerable communities.
The disproportionate interference with people’s privacy identified by the legal analysis paints an altogether different picture of the Online Safety Bill. Far from being a law to establish accountability for online crime, the legislation, as drafted, opens the door to sweeping new powers of surveillance with little public debate over their purpose and proportionality. Unless the government reconsiders or Parliament pushes back, these powers are set on a collision course with independent media and journalism as well as marginalised groups.
Download this new legal opinion on the Online Safety Bill here
As you may have seen from our social media feeds and our website, Index on Censorship is working to ensure MPs and the public are aware of the unintended consequences that may arise from the UK Government’s planned Online Safety Bill.
The Bill is based on the ‘duty of care’ concept, which underpins health and safety law in the workplace. However, there is a huge difference between protecting workers from workplace injury and protecting citizens from harm on the internet at the same time as protecting our fundamental freedom of expression rights.
The Bill has introduced the concept of ‘legal but harmful’ and would give social media platforms the power to remove content that could be considered ‘harmful’ to some people. But who makes that decision? Governments, private companies, an algorithm? Who decides when an idea is harmful but remains legal? Where would we be if the suffragettes had been considered harmful? Where would we be if Pride marches had been considered harmful? Where would we be if the civil rights movement had been considered harmful? This is a fundamentally flawed concept.
We already have laws against child abuse, against hate speech, and against death threats – what we need is not more legislation, but more training and resources for the police and relevant organisations to tackle these crimes. The risk with the Online Safety Bill is not only that these resources are not given to tackle the issue of child abuse, but that more freedoms and rights are taken away from people and our democracy is threatened.
The EU is now developing its own online legislation along the lines of the Online Safety Bill with its Digital Services Act. Across the world, the dominance of social media is generating real regulatory challenges, particularly around who is responsible for what is posted online and what is liable to be taken down. Neither question has a simple answer, and both carry considerable pitfalls for democratic rights. Answering them in hurried legislation is a poor substitute for a considered response to what are legitimate concerns.
Over the next few months, Index will be working with European organisations to raise awareness of the ‘unintended consequences’ of the Digital Services Act that will hopefully also help to inform the debate here in the UK. The internet is worldwide, borders are irrelevant, and we have to ensure that vulnerable and marginalised voices are not erased from our societies. The internet is our new Wild West, but we must be careful of knee-jerk reactions that aim to do some good but end up restricting the freedoms we all value.
We have launched the #OffOn campaign to tell MEPs not to switch off our freedoms online and instead to protect fundamental freedoms of expression while strengthening the rule of law relating to criminal offences.
The aims of this campaign are to:
- Preserve what works and fix what is broken
The internet is still a formidable network that connects and empowers people. Preserving and enhancing fundamental rights must be the cornerstone upon which any legislation is built.
- Limit online regulation to addressing illegal content
Ensure that the process of judicial review is at the core of any adjudication mechanism.
- Support user empowerment and wider participation
Legislation should focus on putting users first by allowing them to have more control over the content they see, the ability to remain anonymous online, the right to end-to-end encryption and the right to be faced with proportionate and fair content moderation practices.
- Ensure due process and legal certainty
The rules applying to the online environment should offer the same due process safeguards as those that apply offline. Arbitration about the legality of content, or its use, often entails long and careful assessments by courts offline, while unrealistic turnaround times are imposed online for the same type of decisions. We must protect the careful balance of the rights at stake, as well as create an environment of legal certainty.
- Promote these principles in international discussions
The principles and objectives we endorse should not apply only to Europeans – they should be at the centre of the EU’s contributions to any discussions in the multilateral and bilateral fora it participates in.
As proud members of the LGBTQ+ community, we know first-hand the vile abuse that regularly takes place online. The data is clear: 78% of us have faced anti-LGBTQ+ hate crime or hate speech online in the last five years.[1] So we understand why the Government is looking for a solution, but the current version of the Online Safety Bill is not the answer – it will make things worse, not better.
The new law introduces the “duty of care” principle and would give internet companies extensive powers to delete posts that may cause ‘harm’. But because the law does not define what it means by ‘harm’, it could result in perfectly legal speech being removed from the web.[2]
As LGBTQ+ people, we have seen what happens when vague rules are put in place to police speech: marginalised voices are silenced, from historic examples of censors banning LGBTQ+ content to ‘protect’ the public, to modern-day content moderation tools marking innocent LGBTQ+ content as explicit or harmful.
This isn’t scaremongering. In 2017, Tumblr’s content filtering system marked non-sexual LGBTQ+ content as explicit and blocked it; in 2020, TikTok censored depictions of homosexuality, such as two men kissing or holding hands, and reduced the reach of LGBTQ+ posts in some countries; and within the last two months, LinkedIn removed a coming-out post from a 16-year-old following complaints.[3]
This Bill, as it stands, would provide a legal basis for this censorship. Moreover, its vague wording makes it easy for hate groups to put pressure on Silicon Valley tech companies to remove LGBTQ+ content and would set a worrying international standard.
Growing calls to end anonymity online also pose a danger. Anonymity allows LGBTQ+ people to share their experiences and sexuality while protecting their privacy, and many non-binary and transgender people do not hold an acceptable form of ID and could be shut out of social media.[4]
The internet provides a crucial space for our community to share experiences and build relationships. 90% of LGBTQ+ young people say they can be themselves online and 96% say the internet has helped them understand more about their sexual orientation and/or gender identity.[5] This Bill puts these spaces at risk.
Racism, homophobia, transphobia, and threats of violence are already illegal. But the data shows that when they happen online, they are ignored by the authorities. After the system for flagging online hate crime was underused by the police, the Home Office stopped including these figures in its annual report altogether, leaving us in the dark about the scale of the problem. The government’s Bill should focus on this illegal content rather than empowering the censorship of legal speech.
This is why we are calling for the “duty of care”, which in the current form of the Online Safety Bill could be used to censor perfectly legal free speech, to be reframed to focus on illegal content; for specific, written protections for legal LGBTQ+ content online; and for the LGBTQ+ community to be properly consulted throughout the process.
Stephen Fry, actor, broadcaster, comedian, director, and writer.
Munroe Bergdorf, model, activist, and writer.
Peter Tatchell, human rights campaigner.
Carrie Lyell, Editor-in-Chief of DIVA Magazine.
James Ball, Global Editor of The Bureau of Investigative Journalism.
Jo Corrall, Founder of This is a Vulva.
Clara Barker, material scientist and Chair of LGBT+ Advisory Group at Oxford University.
Marc Thompson, Director of The Love Tank and co-founder of PrEPster and BlackOut UK.
Sade Giliberti, TV presenter, actor, and media personality.
Fox Fisher, artist, author, filmmaker, and LGBTQIA+ rights advocate.
Cara English, Head of Public Engagement at Gendered Intelligence, Founder OpenLavs.
Paula Akpan, journalist, and founder of Black Queer Travel Guide.
Tom Rasmussen, writer, singer, and drag performer.
Jamie Wareham, LGBTQ journalist and host of the #QueerAF podcast.
Crystal Lubrikunt, international drag performer, host, and producer.
David Robson, Chair of London LGBT+ Forums Network.
Shane ShayShay Konno, drag performer, curator and host of the ShayShay Show, and founder of The Bitten Peach.
UK Black Pride, Europe’s largest celebration for African, Asian, Middle Eastern, Latin American, and Caribbean-heritage LGBTQI+ people.