We the undersigned have serious concerns about Part 2 of the Scottish Government’s Hate Crime and Public Order Bill, and increasingly so in light of recent parliamentary deliberations.
Over the last year, there has been a robust debate about Part 2 of the bill, which outlines new offences on the stirring up of hatred. We all condemn crimes motivated by hatred and prejudice. The difficulty with this Bill, in its current form, is its potential to have a wider, negative effect on freedom of expression in Scotland.
When the bill was published last year, the police, the legal profession, academics, civil liberties groups and others cautioned that the offences could catch legitimate debate on a range of issues. The vague wording of the offences and a lack of adequate free speech protections could, they warned, place a chill on free expression in the arts, the media and the public square when it comes to discussions about contentious issues such as religion and trans rights.
After a wide and sustained backlash, the Scottish Government announced several concessions. Most significantly, Ministers conceded that the offences should be limited to intentional conduct. The Government also committed to ‘broadening and deepening’ a free speech clause covering religion and to inserting a new clause on transgender identity.
Cabinet Secretary for Justice Humza Yousaf lodged amendments to effect these changes ahead of Stage 2 deliberations by the Justice Committee, which began on 2 February 2021. However, the Cabinet Secretary, in agreement with other MSPs on the Committee, decided to withdraw amendments on freedom of expression at the eleventh hour, saying he would seek ‘consensus’ on a ‘catch-all’ free speech clause, to be drafted ahead of Stage 3.
This move has, in our view, undermined the whole process of scrutiny to date. Amendments to safeguard freedom of expression on religion, sexual orientation and transgender identity – topics that are subject to strong and often controversial debate – were vitally important and agreed upon by the majority of stakeholders who have engaged with parliament over the last 12 months.
Providing separate and robust freedom of expression provisions on these topics was also the approach advocated by Lord Bracadale QC in evidence to the Committee last year. He said: “Such amendments to the bill would be an expression of the kind of line that we want to identify between ‘offensive behaviour’ on one side and ‘threatening and abusive behaviour’ on the other”.
The idea that a workable ‘catch-all’ provision covering these topics, as well as the characteristics of age, disability, and variations of sex characteristics, can be agreed upon by the government and other parties before final, Stage 3 proceedings take place is, frankly, untenable. Manufacturing such a clause over the next few weeks, behind closed doors, will also necessarily preclude the views of parliament, stakeholders and the public from being taken into account.
We strongly believe that producing workable provisions on the stirring up of hatred in this parliament is now entirely impracticable. These provisions could impact upon the most precious liberties in any democratic society: freedom of speech, freedom of expression, freedom of conscience and religion. They must be handled with the utmost care.
We urge MSPs in every party to oppose Part 2 of the Hate Crime Bill and allow other, non-contentious aspects of the bill to proceed without it. New proposals on the stirring up of hatred could be brought forward in the next parliament, where they would be scrutinised thoroughly over time, with renewed input by a wide range of stakeholders.
Sincerely,
Ruth Smeeth, Chief Executive, Index on Censorship;
Emma Webb, Associate Fellow, Civitas;
Ian Murray, Executive Director, Society of Editors;
Peter Tatchell, human rights campaigner;
Jim Sillars, former Deputy Leader, Scottish National Party;
Stephen Evans, CEO, the National Secular Society;
Simon Calvert, Deputy Director, The Christian Institute;
Hardeep Singh, Deputy Director, Network of Sikh Organisations;
Trina Budge, Director, For Women Scot;
Andrew Allison, Head of Campaigns, Freedom Association;
Kapil Summan, Editor, Scottish Legal News;
Dr Kath Murray, Research Fellow in Criminology, University of Edinburgh;
Lucy Hunter Blackburn, researcher and former senior civil servant;
Lisa MacKenzie, independent researcher;
Dr Stuart Waiton, sociologist, Abertay University, Dundee;
Today, Tuesday, the British government has finally responded to its own consultation on Online Harms. Our role at Index on Censorship is to defend free expression and free speech for all citizens wherever they live. This includes in the UK.
Index has significant concerns about the government’s proposals and their unintended consequences for our collective right to free speech. We are also concerned about the global impact of these proposals: by instituting restrictive policies for social media companies, the British government sends a message to repressive regimes that relentlessly seek to undermine the rights of their citizens.
While acknowledging that there are problems with the regulation of online platforms, Index will be engaging with policymakers to try to make this legislation better at protecting our right to free expression.
Our key concerns are:
Legal but harmful
The British government is proposing a new classification of speech: ‘legal but harmful’ content, such as abuse, which would be restricted online despite being perfectly acceptable offline. Such inconsistency in our legal framework for speech is ludicrous and would have significant unintended consequences.
Emphasis on the platforms not the perpetrators
The penalties outlined in these proposals focus on the role of the platforms in regulating their online spaces, not on their customers, who seemingly bear limited personal responsibility. The proposals also fail to acknowledge that this is a cultural problem, and one that therefore needs a carrot as well as a stick.
No one is going to be fined for deleting too much content
The proposals will fine social media companies for not complying with the new regulatory framework. Although ministers have issued warm words about protecting freedom of speech, it seems highly unlikely that a platform would be sanctioned for deleting too much content, leaving social media companies to err on the side of caution and delete challenging content even if it does not contravene the legislation.
Digital evidence locker
These proposals seemingly advocate the permanent removal of significant amounts of content, curtailing a victim’s ability to prosecute: once a platform has deleted content, there is no way to retrieve it, even for law enforcement. This includes evidence of terrorist atrocities; 23% of the Syrian War Crime Archive has already been deleted by the platforms. The lack of legal protections allowing the platforms to store this content (out of sight) for access by law enforcement, journalists and academics prevents both prosecution and analysis. Index believes a compromise would be the creation of a legal framework allowing social media platforms to create Digital Evidence Lockers.
The Rt Hon Jeremy Wright QC MP
Secretary of State for Digital, Culture, Media and Sport
100 Parliament Street
London
SW1A 2BQ
1 July 2019
Re: Online Harms White Paper
Dear Secretary of State,
We write as a group of organisations keenly interested in the government’s proposals for Internet regulation. We recently convened a day-long multi-stakeholder workshop to discuss the implications of the 2019 Online Harms White Paper and write to share the conclusions and findings from that event.
Organisations represented at the workshop included human rights NGOs, social media platforms, telecoms and media companies, news media, industry associations, parenting and child rights organisations, academia, think tanks, government departments and independent regulators. The aim was to bring together representatives from all relevant sectors, discuss differences of opinion and find areas of consensus.
One unanimous finding from the day was that “there is a need for a systematic approach to dealing with problematic content online, but the group did not support the adoption of a ‘duty of care’ approach”. Many participants noted that the concept of duty of care does not translate well from the offline to the online context, and as such it provides little clarity as to what duties can and should be expected of companies within scope of the OHWP.
Another key finding of the workshop was that, whilst government departments had conducted outreach throughout this process, no exercise conducted by government had brought together all of the key groups (including civil society organisations, children’s charities, media companies, global tech giants, British startups, and UK media/press) in a coherent way.
We believe this risks a process dominated by some stakeholders, in which policy is developed without a full overview of where stakeholders’ concerns and consensus really lie. We urge you, after the formal consultation period closes, to convene a comprehensive meeting of all relevant stakeholders to discuss key elements of the proposals formally and to map a way forward.
We welcome this opportunity to continue to engage with the government and look forward to your response.
Yours sincerely,
Oxford Internet Institute
Open Rights Group
Global Partners Digital
Index on Censorship
The Coalition for a Digital Economy
Cc Secretary of State for the Home Department, The Right Hon. Sajid Javid MP
Parliament must be fully involved in shaping the government’s proposals for online regulation as the proposals have the potential to cause large-scale impacts on freedom of expression and other rights.
The proposed duty of care needs to be limited and defined in a way that addresses the risk that it will create a strong incentive for companies and others to censor legal content, especially if combined with fines and personal liability for senior managers.
It is important to widen the focus from harms and what individual users do online to the structural and systemic issues in the architecture of the online world. For example, much greater transparency is needed about how algorithms influence what a user sees.
The government is aiming to work with other countries to build international consensus behind the proposals in the white paper. This makes it particularly important that the UK’s plans for online regulation meet international human rights standards. Parliament should ensure that the proposals are scrutinised for compatibility with the UK’s international obligations.
More scrutiny is needed regarding the implications of the proposals for media freedom, as “harmful” news stories risk being caught.
Introduction
The proposals in the government’s online harms white paper risk damaging freedom of expression in the UK, and abroad if other countries follow the UK’s example.
A proposed new statutory duty of care to tackle online “harms” combined with substantial fines and possibly even personal criminal liability for senior managers would create a strong incentive for companies to remove content.
The “harms” are not clearly defined but include activities and materials that are legal.
Even the smallest companies and non-profit organisations are covered, as are public discussion forums and file sharing sites.
The proposals come less than two months after the widely criticised Counter-Terrorism and Border Security Act 2019. The act contains severe limitations on freedom of expression and access to information online (see Index report for more information).
The duty of care: a strong incentive to censor online content
The proposed new statutory duty of care to tackle online harms, combined with the possibility of substantial fines and possibly even personal criminal liability for senior managers, risks creating a strong incentive to restrict and remove online content.
Will Perrin and Lorna Woods, who have developed the online duty of care concept, envisage that the duty will be implemented by applying the “precautionary principle” which would allow a future regulator to “act on emerging evidence”.
Guidance by the UK Interdepartmental Liaison Group on Risk Assessment (UK-ILGRA) states:
“The purpose of the Precautionary Principle is to create an impetus to take a decision notwithstanding scientific uncertainty about the nature and extent of the risk, i.e. to avoid ‘paralysis by analysis’ by removing excuses for inaction on the grounds of scientific uncertainty.”
The guidance makes sense when addressing issues such as environmental pollution, but applying it in a context where freedom of expression is at stake risks legitimising censorship – a very dangerous step to take.
Not just large companies
The duty of care would cover companies of all sizes: social media companies, public discussion forums, retailers that allow users to review products online, non-profit organisations (for example, Index on Censorship), file sharing sites and cloud hosting providers. A blog with its comments would be included, as would shared Google documents.
The proposed new regulator is supposed to take a “proportionate” approach, which would take into account companies’ size and capacity, but it is unclear what this would mean in practice.
Censoring legal “harms”
The white paper lists a wide range of harms, for example, terrorist content, extremist content, child sexual exploitation, organised immigration crime, modern slavery, content illegally uploaded from prisons, cyberbullying, disinformation, coercive behaviour, intimidation, under 18s using dating apps and excessive screen time.
The harms are divided into three groups: harms with a clear definition; harms with a less clear definition; and underage exposure to legal content. Activities and materials that are not illegal are explicitly included. This would create a double standard, where activities and materials that are legal offline would effectively become illegal online.
The focus on the catch-all term of “harms” tends to oversimplify the issues. For example, the recent Ofcom and Information Commissioner’s Office study Online Nation found that 61% of adults had a potentially harmful experience online in the last 12 months. However, this figure included experiences described as “mildly annoying”. Not all harms need a legislative response.
A new regulator
The white paper proposes the establishment of an independent regulator for online safety, which could be a new or existing body. It mentions the possibility of an existing regulator, possibly Ofcom, taking on the role for an interim period to allow time to establish a new regulatory body.
The future regulator would have a daunting task. It would include defining what companies (and presumably also others covered by the proposed duty of care) would need to do to fulfil the duty of care, establishing a “transparency, trust and accountability framework” to assess compliance and taking enforcement action as needed.
The regulator would be expected to develop codes of practice setting out in detail what companies need to do to fulfil the duty of care. If a company chose not to follow a particular code it would need to justify how its own approach meets the same standard as the code. The government would have the power to direct the regulator in relation to codes of practice on terrorist content and child sexual exploitation and abuse.
Enforcement
The new enforcement powers outlined in the white paper will include substantial fines. The government is inviting consultation responses on a list of possible further enforcement measures. These include disruption of business activities (for example, forcing third-party companies to withdraw services), ISP blocking (making a platform inaccessible from the UK) and creating a new liability for individual senior managers, which could involve personal liability for civil fines or could even extend to criminal liability.
Undermining media freedom
The proposals in the white paper pose a serious risk to media freedom. Culture Secretary Jeremy Wright has written to the Society of Editors in response to concerns, but many remain unconvinced.
As noted above, the proposed duty of care would cover a very broad range of “harms”, including disinformation and violent content. In combination with fines and potentially even personal criminal liability, this would create a strong incentive for platforms to remove content proactively, including news that might be considered “harmful”.
Index has filed an official alert about the threat to media freedom with the Council of Europe’s Platform to promote the protection of journalism and safety of journalists. Index and the Association of European Journalists (AEJ) have made a statement about the lack of detail in the UK’s reply to the alert. At the time of writing the UK has not provided a more detailed reply.
Censorship and monitoring
The European Union’s e-commerce directive is the basis for the current liability rules related to online content. The directive shields online platforms from liability for illegal content that users upload unless the platform is aware of the content. The directive also prohibits general monitoring of what people upload or transmit.
The white paper states that the government’s aim is to increase this responsibility and that it will introduce specific monitoring requirements for some categories of illegal content. This comes close to dangerous censorship territory, and it is doubtful whether it would be compatible with the e-commerce directive.
Restrictions on freedom of expression and access to information are extremely serious measures and should be backed by strong evidence that they are necessary and will serve an important purpose. Under international law freedom of expression can only be restricted in certain limited circumstances for specific reasons. It is far from clear that the proposals set out in the white paper would meet international standards.
Freedom of expression – not a high priority
The white paper gives far too little attention to freedom of expression. While the proposed regulator would have a specific legal obligation to pay due regard to innovation, when it comes to freedom of expression the paper refers only to an obligation to protect users’ rights, “particularly rights to privacy and freedom of expression”.
It is surprising and disappointing that the white paper, which sets out measures with far-reaching potential to interfere with freedom of expression, does not contain a strong and unambiguous commitment to safeguarding this right.
Contact: Joy Hyvarinen, Head of Advocacy, [email protected]