Online Safety Act: Ofcom’s 1,700 pages of tech platform rules
- Author: Monica Horten
- Published: 10 November 2023
New draft rules for Internet platforms have been released this week by Ofcom. The rules tackle a wide range of content which the government now says is illegal. They address unlawful migration, harassment, online fraud, and even public protests.
This is the first in a suite of announcements expected over the next nine months, which will tell the Internet and tech industries what content moderation and other actions they will be required to take.
So what is in the announcement?
Ofcom, the UK telecoms and media regulator, is now officially also the online content regulator. It is required to produce these guidelines for tech platforms under the UK's new Online Safety Act. They are expected to come into force in late 2024 or early 2025.
The guidelines set out in detail the compliance requirements for illegal content on online platforms, as per section 10 of the Act. Illegal content is defined in Schedules 5, 6 and 7 of the Act in the form of a long list of criminal offences. These offences do include child sexual abuse offences, which relate to Schedule 6 of the Act and have been widely publicised. However, the list includes many other offences. To mention a few, they include unlawful migration under the Immigration Act 1971, and offences under the Suicide Act 1961, the Modern Slavery Act and section 5 of the Public Order Act. There is also a foreign interference offence, which links to the National Security Act 2023.
All platforms offering shareable content will have to follow these new rules. They will be required to proactively detect and remove this content, and they will have to make the determination of illegality themselves.
The guidelines are dismissive of any chilling effect on free speech, which I think is poor coming from Ofcom. There is provision in the Act for balancing against freedom of expression, which is explicitly defined as meaning Article 10 of the European Convention on Human Rights.
With six volumes, nine annexes and an overview, the guidelines come to just over 1,700 pages. They are currently in draft form, pending a consultation (submission deadline 23 February next year). Ofcom plans to release the final rules next autumn, after which they go to the Secretary of State and then to Parliament.
A significant issue with these guidelines is that they don't key in to the provisions of the law they are referring to. They actually fail to cite the relevant provisions, which is unhelpful for transparency and makes scrutiny all but impossible. The language is altered, and the terms and definitions used are different. For example, 'U2U' looks like a new brand of toiletries, not shorthand for the 'user-to-user' service defined in law.
Terms are sometimes used interchangeably in a way that is confusing, for example 'illegal online harms' and 'priority offences'. And anyway, there is no definition of 'illegal harms', which is not a term used in the Act. Moreover, the guidelines are written without taking account of the provider's perspective, which, for a regulator, seems a bit remiss.
Anyone unfamiliar with the law will struggle to understand what these guidelines are asking them to do. The government's Impact Assessment states that someone paid a junior rate of £20 an hour could familiarise themselves with this law in four hours. Good luck to them!
----
I am an independent policy advisor, working on a freelance consultancy basis. If you need help with your response to Ofcom, please get in touch to discuss your needs.
You are free to cite from this article. Kindly acknowledge Dr Monica Horten as the author and provide a link back.
Here is the link to the Ofcom guidelines. You have until 23 February 2024 to respond to Ofcom on this consultation.
The graphic used above is the Ofcom roadmap for implementation of the Online Safety Act, obtained from the Ofcom website.