“The Online Safety Bill will fundamentally undermine rights to freedom of expression”
Index and other organisations ask UN Special Rapporteurs to intervene on proposed UK legislation
16 Nov 22


Irene Khan
UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, by email.

CC – Dr. Ana Brian Nougrères, UN Special Rapporteur on the right to privacy, by email.


Re: Concerning developments for human rights online in the UK

Dear Ms Khan,

We are writing to you regarding the UK Government’s Online Safety Bill, legislation which we believe will fundamentally undermine the rights to freedom of expression, privacy and other human rights online in the UK.

We note that in March 2022 you wrote to the UK Government expressing concerns about this Bill (OL GBR 5/2022). In your correspondence you stated:

“I believe the proposed Bill, as currently drafted, contains some key provisions that could undermine its overall objective as well as international human rights principles.”

Regrettably, since this correspondence the legislation has not been materially improved to protect human rights, despite having now been laid before Parliament and subjected to considerable revisions. We therefore urge you to use your mandates to provide recommendations to the UK Government to amend the following concerning aspects of the Bill:

The Bill will lead to the restriction of speech considered “legal but harmful”.

We are particularly concerned over the provisions of the Bill which will place pressure on the largest platforms to restrict content the government has designated to be “harmful” (clause 13). In your correspondence to the UK Government of 14 March 2022 you said of this obligation:

“The duty of care placed upon online providers to protect users of their services against legal but harmful content uses vague terms that are open to broad interpretation, such as “reasonably identify”, “material risk”, “significant adverse physical or psychological impact”, “ordinary sensibilities”, and so risks undue removal of content.”

Despite further revisions to the legislation, the Bill continues to place an obligation on online intermediaries to address content which is “legal but harmful”. While the draft Bill gave speech of this nature a definition, no such definition exists in the full Bill. An indicative list of possible categories of harmful content was issued by the government in June this year.1 However, this list is not legally binding and cannot be considered sufficiently precise to meet the legality requirement under international human rights standards. Rather, the power to designate speech which is “legal but harmful” lies with the Secretary of State, which creates scope for political censorship and seriously compromises the independence of the regulatory framework.

As you noted in your letter, the Bill provides little detail of the obligations regarding harmful content, and providers could be required to make subjective assessments of its potential impact. Consequently, forms of speech which are permitted in the offline world and are protected under international human rights law would be censored online, creating two different standards of permissible speech.

The Bill will mean online platforms, not courts, enforcing UK law

In your letter to the UK Government of 14 March 2022, you noted your concerns about the obligations the Online Safety Bill places on platforms to perform functions – namely the duty to remove illegal content – which should be the preserve of law enforcement bodies and independent courts. You stated:

“I am concerned that this obligation delegates to private companies a responsibility that should be exercised by law enforcement, particularly for offences where the boundary between offensive but legal and illegal conduct may be difficult to discern, such as hate crime.”

The Bill continues to require online platforms to determine whether the speech of people in the UK is legal and to remove it if they believe it is not, undermining the rule of law (clause 9). Online platforms will inevitably turn to automated systems rather than trained people, and such systems are unable to make these nuanced and difficult legal assessments. As a further development of this obligation, platforms will now have a duty under clause 9 of the Bill to “prevent” content (and not only limit its visibility, as required under the draft Bill) that they “reasonably consider” could be what the Government describe as “priority illegal content”. Such priority content is defined in Schedules 5, 6 and 7 of the Bill as a list of criminal offences. The list includes provisions from public order and anti-terror legislation which would set the legal limits of legitimate expression. We are concerned that the new language in clause 9 could push platforms to use “upload filters” and risks collateral censorship on a large scale.

Private actors should not be tasked with making such decisions over the legality of people’s behaviour. This is the role of transparent, independent and accountable public authorities such as courts. However, the Bill does nothing to ensure that the police and courts are properly resourced to prosecute, convict, and sentence those who break the law online, depriving victims of justice.

The Bill compromises end-to-end encryption of private messages

In your letter of 14 March, you noted the importance of the right to privacy as a right which also reinforces protection of the right to freedom of expression. You stated:

“I am concerned that the inclusion of direct private messaging within the scope of the Bill could impact negatively on encryption, security and privacy. I have similar concerns regarding Ofcom’s ability to compel a service to use technology to detect child sexual exploitation and abuse (CSEA) and terrorism content on private and public channels and CSEA content on private communication channels”.

The latest version of the Bill continues to bring encrypted chat services into scope via a definition of content as anything that is “communicated publicly or privately”. The obligations on services mandated by the Bill could be imposed on providers of encrypted messaging services via an enforcement power handed to the regulator, Ofcom, without any further judicial or administrative oversight. This would allow Ofcom to mandate that a service use government-“accredited technology” to surveil private channels, even if they are protected by end-to-end encryption (clause 104).

Encryption tools have become vital for individuals to communicate securely. This is particularly true for human rights defenders, journalists, whistleblowers, victims of domestic abuse or individuals from marginalised groups. Undermining these individuals’ ability to communicate privately and securely would threaten both their safety and their right to freedom of expression.

Your other concerns have also not been addressed: the media exemption; the lack of any quality standards required for the internal complaints mechanisms; the scale of fines that could be imposed on providers; the inadequate requirement on providers to “have regard” to freedom of expression; and the excessive powers granted by the Bill to the Secretary of State.

We believe an intervention from you on the legislation would be timely. The Bill has nearly completed its passage through the House of Commons and will soon enter the House of Lords for further consideration, although in recent weeks it has been paused for further review.

We urge you to issue a statement or communicate your concerns to the Government, recalling their long-standing obligations in international and domestic law, and their recent pledges to defend freedom of expression in the UK and abroad.

Yours sincerely,
Mark Johnson – Big Brother Watch
Barbora Bukovská – ARTICLE 19
Sam Grant – Liberty
Dr Monica Horten – Open Rights Group
Daniel Pryor – Adam Smith Institute
Ruth Smeeth – Index on Censorship