First thoughts on the UK’s draft Online Safety Bill
After a Green Paper, a White Paper and an interim and final response, on Friday the UK government finally published the first draft of its long-heralded Online Safety Bill.
The Bill aims to tackle a range of forms of illegal and harmful online content through the creation of a new regulatory framework that applies to online platforms, overseen by the UK's existing communications regulator, Ofcom. If passed, it will be the most comprehensive and demanding piece of online content regulation in the world, going far beyond the regimes seen in Germany, Australia and elsewhere.
Throughout the development of the legislation, GPD has raised concerns over potential risks to freedom of expression and privacy online. Despite some tinkering at the edges, the published draft Bill is as troubling as expected and, in some ways, even more so.
The draft Bill is also long and complex, comprising 141 clauses and five schedules spread over 145 pages. As such, we do not attempt to summarise the entirety of the Bill’s provisions in this post. Instead, we highlight the most significant elements of the Bill that we welcome, those we have concerns about, and those which, frankly, we do not entirely understand.
THE GOOD
One welcome aspect of the draft Bill is the move away from a single overarching “duty of care” on platforms to a clearer list of more specific duties around identifying and addressing the forms of illegal and harmful content within the Bill’s scope. These primarily relate to undertaking risk assessments and taking proportionate steps to address the risks identified. That being said, these duties are still complex and some (noted below) potentially contradict each other, undermining the benefits of this more targeted approach.
We welcome the statutory duty on online platforms to consider users’ rights to freedom of expression and privacy when developing and implementing their policies and procedures. It’s good that Ofcom would also be required to consider these rights when developing codes of practice relating to the new duties the draft Bill would impose.
Additionally, we are pleased to see a further statutory duty on online platforms to allow users and affected persons to easily make complaints in relation to the removal of content (as well as to report content). We also welcome the provisions requiring online platforms to be more transparent about their content policies and the measures that are taken to address illegal and harmful content.
THE BAD
Despite these positive aspects, large swathes of the draft Bill pose serious risks to the rights of freedom of expression and privacy.
Significantly, several provisions of the draft Bill would put pressure on large platforms to remove content which is perfectly lawful. The test for what constitutes this type of “lawful but harmful” content is vague: namely, where there are “reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities”. While the government had originally stated that platforms would not have to remove any “lawful but harmful” content, the draft Bill places a duty on large platforms to specify how this type of content will be “dealt with”, suggesting that action is expected. This pressure, combined with the vague definition, makes it inevitable that lawful content will be censored.
Even with the provisions around illegal content—against which all platforms will be expected to take action—the scope is overly broad. Rather than being limited to forms of illegal content which have clear legal boundaries (like child sexual abuse material), the duty will apply to any criminal offence in the UK where “the victim or intended victim is an individual”. This “outrageously broad” definition would include criminal offences such as sending messages which are “grossly offensive”, “indecent” or “obscene”. If the government’s proposed Police, Crime, Sentencing and Courts Bill is passed into law, it could even include online speech which is “seriously annoying”.
The draft Bill imposes specific additional duties for online platforms which are “likely to be accessed by children”. In practice, this will mean all online platforms, since a platform will be treated as likely to be accessed by children if it is merely “possible” for them to access it. These additional duties require platforms to prevent content which is harmful to children from being made available. By setting such a low bar before these additional duties apply, the draft Bill creates a real risk that platforms will moderate the entirety of their content to make it child-friendly, removing content which is neither illegal nor even harmful to adults.
From a privacy perspective, the draft Bill contains no meaningful safeguards to ensure that encrypted and private communication platforms are not subject to the same requirements as public platforms. Without such safeguards, private channels will need to be monitored, and even providing end-to-end encryption could risk non-compliance with the legislation. As a member of the Global Encryption Coalition, we will continue to defend the availability of strong encryption as a critical means of ensuring everyone’s privacy and security.
THE DOWNRIGHT CONFUSING
Seemingly in response to concerns following the suspension of Donald Trump from social media, large platforms will have further duties relating to “content of democratic importance”. Specifically, they will be required to consider “the free expression of content of democratic importance” when making content moderation decisions, and to ensure that their policies “apply in the same way to a diversity of political opinion”. The definition of “content of democratic importance” is vague, referring to content which “is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom”. This requirement is not only confusing, but potentially inconsistent with the legislation’s aim of preventing harm, given that it is sometimes political figures who incite violence on social media.
In addition, the draft Bill would establish a set of duties in relation to “journalistic content”, defined as content published by a “recognised news publisher”, or such content when it is shared by a user. To qualify as a “recognised news publisher”, an entity must meet a number of criteria: its principal purpose must be the publication of news-related material, it must publish such material in the course of a business, and it must be subject to a standards code and have policies and procedures for handling and resolving complaints. But the entity must also be registered in the UK, leaving news publishers from outside the UK without protection, and the definition would exclude other forms of journalism, such as citizen journalism. It also creates an inconsistent regulatory approach in which the same words would be protected if contained within a news article, but not if an individual person posts them.
NEXT STEPS
The draft Bill will now undergo pre-legislative scrutiny by a committee of the UK Parliament over the next few weeks. Based on the result of the committee’s scrutiny, the government has committed to introducing a revised Bill to Parliament before the end of 2021. GPD, as part of the Save Online Speech coalition, will be calling for significant revisions to the Bill to address our concerns, and ensure that the final legislation fully respects our rights to freedom of expression and privacy.