
Fixing the human rights failings in the Online Safety Act

Keir Starmer’s strong commitment to keep the UK in the European Convention on Human Rights is very welcome. As he and his government settle into their new roles, we respectfully remind them of one piece of law, made under the last government, that may need some attention: the Online Safety Act.

The Online Safety Act was passed into law in October 2023. It fell victim to all the failings of the last government: poor drafting, lack of clarity, skeleton clauses and a disdain for the European Convention on Human Rights. As such, human rights compliance was scarcely referenced. Only an 11th-hour amendment by the cross-bench peer Lord Hope of Craighead ensured that Article 10 of the Convention, the right to freedom of expression, was explicitly referenced in the Act [see Online Safety Bill: ray of hope for free speech].

The Online Safety Act does not establish Article 8 ECHR, the right to privacy, as an over-arching principle. Indeed, the Act refers to privacy only as a possible breach of law relevant to the service. This matters because so many of the Act's measures effectively require some form of screening or monitoring. Article 8 is of course a matter of public law, and as such Ofcom is bound by it. However, this would not be clear to anyone trying to comply with the Act, and compliance would be strengthened by referencing Article 8 explicitly.

A Labour amendment to insert judicial review of a key provision was dropped under political pressure. Private actors are given sole discretion to take decisions on criminality and illegality: they will do this on the basis of data points gathered by AI systems. There are no procedural safeguards for individuals whose content is wrongly targeted by the measures in the Act. Keir Starmer will need no explanation as to why this needs fixing.

For an Act that deals with people's speech, that is a serious omission. Article 10 protects the right to free speech, and it governs the process when the State wants to restrict speech, which is what this Act is all about. The entire premise of this Act is to curtail speech which the government proscribes. In that regard, it immediately engages Article 10, regardless of whether the speech is considered to be legitimate or unacceptable. Article 10 kicks in either way. Its role is to guide the State on the process.

There are 130 criminal offences listed in the Online Safety Act. These offences signal the proscribed content. The Act requires private actors to interpret these offences, to decide whether posted content is legal or illegal, and to take action. The required action usually means removing the content or locking the account. In technical terms, this is usually filtering and blocking. The latest technology uses artificial intelligence and machine learning tools.

It has long been established, including in case law, that filtering and blocking measures do engage Article 10. They also engage Article 8, because these measures are a form of snooping. This was all ignored by the last government under pressure from vested interests. It is time to put it right.

Lord Hope of Craighead’s amendment (Clause 236 Online Safety Act)* was good to see, and it does at least help to establish Article 10 as an over-arching principle of the Act. There are some other fairly easy changes that would be welcome:

-  Prohibit any form of general monitoring mandate. General monitoring means the continuous checking and filtering of all content on the system, 24 hours a day, 365 days a year, without any limit on duration. Governments should not impose such a mandate on networks, systems or platforms because it is inherently a form of real-time surveillance and violates Article 8. This was a Labour amendment to the Bill as it passed through Parliament. It was dropped due to lack of support from the previous government. It would be appropriate for the new government to show its support for the ECHR.

-  Judicial review of measures in S. 121 that could impact on end-to-end encrypted services. This was also a Labour amendment during the Parliamentary process, and would strengthen human rights compliance.

-  Procedural safeguards for individuals affected by the measures in the Act. This would go some way to ensuring fairness when online platforms make wrongful decisions. Currently, it is up to Ofcom, the regulator, to put something into its Codes of Practice (Clause 49). It is the duty of the government to establish these safeguards in law.

None of this prevents the government from tackling the serious issues around abusive and violent content, sexual abuse, and criminal acts. However, it does provide a framework to ensure that the measures in the Act are applied fairly.

---

 

*The amendment: “freedom of expression”: any reference to freedom of expression (except in sections 41(6)(f) and 78(2)(d)) is to the freedom to receive and impart ideas, opinions or information (referred to in Article 10(1) of the Convention) by means of speech, writing or images;

 


---

If you cite this article, kindly acknowledge Dr Monica Horten as the author and provide a link back.

I provide independent advice on policy issues related to online content. Please get in touch via the contact page.

 

