
Draft Online Safety Bill committee, 4 November 2021

TL;DR Key decisions will be taken behind Whitehall facades, with no checks and balances. The entire framework of the Bill is loosely defined and propped up by Henry VIII clauses that allow the Secretary of State (DCMS and Home Office) to implement the law using Statutory Instruments. This means that Ministerial decisions will get little or no scrutiny by Parliament. This will include crucial decisions about content to be suppressed and compliance functions required of Internet services. Standards for automated detection of illegal content will be determined by the Home Secretary. The concern is whether these powers could ever be used to block lawful but inconvenient speech.

UK government Ministers will have unprecedented powers over online speech if a new Bill, set to go before Parliament this year, ever becomes law. The Online Safety Bill [1] is a major piece of legislation intended to tackle the very difficult and troubling issues around social media. However, in its desire to remove the bad stuff, the Bill sets up a regime where Ministers could make all the key decisions behind closed doors.

It gives Ministers broad discretion to decide what content should be removed and the ways in which Internet services will have to comply. They will do this after the Bill (currently in draft form) has been passed into law by Parliament. Effectively, Parliament is being asked to pass a law that consists of little more than bullet points. Ministers will fill in the required actions. Parliament will have little, if any, opportunity to scrutinise their decisions. Debates are only likely if there is significant opposition, and amendments will not be possible.

The Bill is sponsored by two Ministries: the Department for Digital, Culture, Media and Sport (DCMS), and the Home Office. DCMS has the lead, and to some extent this has overshadowed the role of the Home Office and its potential to influence the Bill's outcomes. This analysis is based on the draft Online Safety Bill of May 2021.

Ministers will set the strategic policy priorities for the Bill, which they are required to set out in a statement. Ofcom, the regulator tasked with overseeing the Bill, must comply with that statement, explain how it proposes to meet those priorities, and, after 12 months, report back on what it did (Clauses 143, 144 and 78).

This in itself is very serious. Ofcom is the regulator, and is duty bound to act independently of government. Asking it to take orders from government Ministers sets up a path to compromising that independence.

Then there is Clause 40(1), which would allow Ministers to amend Codes of Practice drafted by Ofcom to ensure they reflect government policy. The Joint Parliamentary Committee that examined the Bill has already jumped on this [2] and recommended that the clause be removed. However, that would not remove all of the Ministerial powers to set policy under this Bill, because those powers are spread throughout the text.

Ministers will be able to determine the services that are in scope (Clause 174), the classification of providers (Schedule 10), and the definitions of the content to be addressed (Clauses 53, 54, 55).

This will be done without checks and balances, by means of powers granted to Ministers to amend the legislation after Parliament has passed it. These powers are often referred to as 'Henry VIII powers', because they enable Ministers to change the law arbitrarily. In the Bill, they are referenced using the language 'making Regulations'.

The Regulations will be set out in what's known as 'Secondary Legislation', typically using a Statutory Instrument, which receives little or no scrutiny by Parliament. A Statutory Instrument can be adopted via the negative procedure, which means that it automatically becomes law on the day the Minister signs it, unless Parliament agrees within 40 days to reject it [3]. Hence, scrutiny is minimal in practice. Alternatively, it can be adopted under the affirmative procedure, by a resolution passed by both Houses. The affirmative procedure gives Parliament some opportunity to examine the text, but the scrutiny is weaker than for an Act, and no amendments are possible. The Joint Bill Committee has recommended that these Statutory Instruments should be subject to the affirmative procedure.

On services, Ministers will be able to change the rules on those that are currently exempt from complying with the Bill. For example, in the draft Bill, comments from users below published articles are exempt, as are one-to-one voice calls and SMS services. Using the Henry VIII powers, Ministers could change these exemptions without any checks and balances (Clause 174).

Ministers will make regulations in the form of a Statutory Instrument to define the service categories (Schedule 10). The Bill classifies Internet services into three categories - Category 1, Category 2, and Category 2a. The regulations will set the threshold number of users and functionalities. Category 1, for example, is expected to cover the big global social media platforms such as Facebook, Twitter, Instagram, Snapchat, and TikTok, but there are suggestions that it could also include dating services. These platforms will be given more responsibility than those in the other two categories. Hence, this categorisation matters.

Ministers may also amend the systems and processes that Internet services must implement and operate in order to comply with the Bill. These requirements sit, confusingly, under the heading of "Online Safety Objectives". I say confusingly because they do not define any objectives, aims or goals; instead they set out a task list that would be at home in a project management plan. The Bill asks the Internet platforms to design and operate a service that has appropriate systems and processes for compliance and risk management, that is adequate to support UK users, and that makes UK users aware of the terms of service. Ministers have the power to amend these requirements via regulations (Schedule 4, clauses 7 and 8).

Ministers will flesh out the definitions of illegal content (Clause 41), content harmful to children (Clause 45) and content harmful to adults (Clause 46). This is of considerable concern. We must bear in mind that this is content that will be suppressed by social media platforms, and it therefore matters to everyone, because it engages the right to freedom of expression. That right must be balanced against any decision to remove posts, and these decisions are not always straightforward.

The categorisation of content harmful to children and content harmful to adults (legal but harmful) raises complex issues. This wording refers to content that is lawful, but is subjectively determined to be harmful. The Bill provides a broad and vague definition: content that 'presents a material risk of significant harm to an appreciable number of adults in the United Kingdom'. It says nothing about the type of content that it expects to be addressed. It does, however, give the social media platforms broad discretion to take down this content if they 'have reasonable grounds to believe' it meets the criteria.

Government Ministers will fill in the criteria after the law has been passed. Parliament will therefore be asked to pass a law without being able to assess the full implications of what the law will do. Moreover, as the removal of this content will be at the discretion of the Internet companies, and incorrect removals will violate freedom of expression rights, the definitions of this content are critical.

Illegal content in the Bill means terrorism content and child sexual abuse material, both of which relate to criminal offences under existing law. Illegal content also refers to a list of offences in Schedule 7, and to new Communications Offences in UK law set out in Part 10 of the Bill. Ministers may amend these schedules by Statutory Instrument, but they are constrained in their ability to add new offences to the list (Clause 176). The inclusion of these offences on the face of the Bill appears to be a response to the Bill Committee, which was concerned that Parliament would have no opportunity for scrutiny. It forced a turn-around by DCMS, as indicated in a media release issued on 4 February. However, the danger remains that new offences could be slipped through in future with little or no Parliamentary scrutiny.

Finally, Ministers will be able to set the standards for automated detection of terrorism and child sexual abuse content (Clauses 105(9) and 105(10)). It is the Home Secretary who will publish these standards. They are required for the Ofcom 'accredited' software that is mandated in the Bill. This software is to be imposed on Internet services that, in Ofcom's opinion, have not done a good enough job. It's a tactic more likely to be associated with autocratic regimes than with a democracy.

This matters a lot. In the case of terrorism content, there are instances where the context of a post can make the difference as to whether it should or should not be taken down. Such decisions should not be left to automated systems alone, nor to private providers, and it is not certain that providers would welcome the imposition of UK government mandated software.

The UK government is currently in crisis mode. The concern is how these powers could be used maliciously, to block lawful but inconvenient speech.


Photo: Draft Online Safety Bill Joint Committee sitting on 4 November 2021, with Secretary of State for Digital, Culture, Media and Sport (DCMS) Nadine Dorries, and Damian Hinds, Minister of State (Minister for Security and Borders), Home Office. Screenshot from https://parliamentlive.tv/ taken by me.


Iptegrity is made available free of charge under a Creative Commons licence. You may cite my work, with attribution. If you reference the material in this article, kindly cite the author as Dr Monica Horten, and link back to Iptegrity.com. You will also find my book for purchase via Amazon.

About me: I've been analysing digital policy for over 14 years. Way back then, I identified the way that issues around rights can influence Internet policy, and that has been a thread throughout all of my research. I hold a PhD in EU Communications Policy from the University of Westminster (2010), and a postgraduate diploma in marketing. For many years before I began my academic research, I was a telecoms journalist and an early adopter of the Internet, writing for the Financial Times and Daily Telegraph, among others.

Please get in touch if you'd like to know more about my current research.

If you liked this article, you may also like my book The Closing of the Net which discusses the backstory to the Online Safety Bill. It introduces the notion of structural power in the context of Internet communications. Available in Kindle and Paperback from only £15.99!


[1] Online Safety Bill as introduced to the House of Commons on 17 March 2022

[2] Draft Online Safety Bill Joint Committee Report of Session 2021-22

[3] See the UK Parliament website for an explanation of the negative procedure
