
Online Safety Bill: does government want to snoop on your WhatsApps?

TL;DR They say they do, but the Bill is not clear. The government has been quite shifty in its use of language, obscuring a requirement for encrypted messaging services to monitor users' communications. To comply with that requirement, those services would have to break the encryption that protects users' privacy, leaving users less safe online. They would also be conflicted in their legal duties to protect users' privacy, as would the regulator Ofcom. Private messaging services are important to millions of UK users. Their obligations under the Online Safety Bill need clarification and amendment.

***UPDATE 24 May 2022: Quietly, behind the scenes, there is confirmation that this is exactly what the government wants to do.***

Listening to some of the proponents of the Online Safety Bill, one gets the distinct impression that it includes encrypted messaging services within its remit. This would mean that services like WhatsApp, Telegram and Signal would be required to follow the same rules as the public social media platforms. It would also include Facebook Messenger, which is integrated with its social media platform. These services would have to monitor users' messages and remove content that falls within UK government-defined criteria for content that is either illegal, or legal but harmful to children or adults. (See What's the point of the Online Safety Bill?)

However, on reading the Bill [1] closely, the position is not clear. If encrypted messaging services were in scope, the Bill would be internally conflicted. It would simultaneously ask services to interfere with the content that users post and mandate them to safeguard users' privacy and freedom of expression. The Bill gives Ofcom powers to enforce both [Clause 111], backed up by fines. Hence, messaging providers could be fined for protecting privacy, and for failing to protect it.

If we understand Government Ministers correctly, then encrypted messaging is in scope. Home Office Minister Damian Hinds told the Joint Parliamentary Bill Committee on 4 November 2021:

'you are absolutely right about the central role of private channels and a further trend in that direction. The provisions in the Bill on Ofcom's powers and on the responsibilities of the platform do not change as a result of content being on a private platform, or private part of the platform, versus a public part.' [2]

The reference to private channels suggests encrypted messaging services, although they are not named. It could also refer to other aspects of a social media service, such as private groups.

Mr Hinds emphasised the role of private channels, saying:

'There are three stages. There is public, there is private, and then there is private and encrypted, and therefore impossible even for the platform itself to see. But the responsibilities are the same in each of those cases. The bespoke technology, systems and processes, approaches and solutions may be different, but the responsibility remains'

'Private and encrypted services' is clear. As suggested above, these include Facebook Messenger, Telegram, Signal and WhatsApp. Mr Hinds acknowledges that the service provider cannot read what people are transmitting on an encrypted service. If it is impossible for the service provider to see the content because it is encrypted, how could they comply with a requirement to monitor and remove content? It is a contradiction.

Cyber-security experts argue that by definition, the provider of an encrypted service would have to break the encryption in order to meet the requirements of the Bill. They further argue that messaging services encrypt content in order to protect people's privacy. If encryption is broken, their privacy is compromised. [3] It leaves users less safe.
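To make the contradiction concrete, here is a minimal sketch of end-to-end encryption. It is purely illustrative and assumes the PyNaCl library; WhatsApp and Signal actually use the far more sophisticated Signal Protocol. The point it shows is that the keys live only on the users' devices, so the service in the middle relays ciphertext it cannot read.

```python
# Illustrative sketch only; real messaging apps use the Signal Protocol,
# not this simplified PyNaCl example.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message that only Bob can decrypt.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"See you at 7pm")

# The service provider relays the ciphertext but holds no private keys,
# so it sees only opaque bytes.
print(ciphertext.hex()[:32], "...")

# Only Bob, holding his own private key, can recover the message.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'See you at 7pm'
```

Any duty to monitor the content of such messages would therefore require either weakening the encryption itself, or inspecting messages on the device before they are encrypted.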

Reading through the Bill, it looks like the government has been quite shifty in drafting an overly broad definition, so that encrypted messaging services could be deemed in scope. They've coined the term 'user-to-user' services, defined as 'an internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users' [Clause 2].

It will be assumed that this refers to social media platforms - the apparent target of the Bill - but it could just as easily describe a messaging service. The Bill defines a 'regulated service' by exemption: it is any user-to-user service except those specifically exempted in Schedule 1 of the Bill. The exempted services include email, SMS and MMS (mobile phone messaging). Messaging services on online platforms are not mentioned in Schedule 1. This is reiterated in the definition of 'regulated content', which is all user-generated content except for the types listed, such as comments, SMS, MMS, and one-to-one live aural communications or, in other words, a phone call [Clause 49(2)]. Once again, encrypted messaging is not listed. The fact that encrypted messaging is not exempt is interpreted by some legal experts as meaning it is included.

The language of 'publicly or privately', as used by Mr Hinds, slips into the definition of 'content'. It does so with no clarification: 'anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description' [Clause 189]. This definition, too, could refer to encrypted messaging services, as suggested by Mr Hinds' comments to the Bill Committee, and it could just as easily refer to private profiles or groups.

The language of 'public or private' is again repeated in Clause 103(2)(b): a service can be required to 'Use accredited technology to identify CSEA content present on any part of the service (public or private), and to swiftly take down that content'. (CSEA stands for child sexual exploitation and abuse.)

The 'accredited technology' refers to content moderation systems, accredited by Ofcom and designed to meet Home Office standards. These systems would monitor and remove child sexual abuse material. If private messaging services fall within scope, the Bill could require a private service to install such a system.
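The Bill does not specify how that technology would work. One approach that has been discussed publicly is scanning on the user's device, where outgoing files are checked against a list of fingerprints of known illegal images before they are encrypted. The sketch below is a hypothetical illustration of that idea using plain SHA-256 hashes; it is not a description of any accredited system, and real deployments use perceptual hashing and are considerably more complex.

```python
# Hypothetical illustration of on-device hash matching; not any real
# accredited system. Real systems use perceptual hashes, not SHA-256.
import hashlib

# A blocklist of fingerprints of known prohibited images, supplied by a
# central authority (placeholder values here).
BLOCKED_HASHES = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest used as the file's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def allowed_to_send(attachment: bytes) -> bool:
    # The check runs on the user's device, before encryption, which is
    # why critics argue it undermines the privacy that encryption provides.
    return fingerprint(attachment) not in BLOCKED_HASHES

print(allowed_to_send(b"holiday photo bytes"))  # True: not on the blocklist
```

Whichever technique were chosen, every message or attachment would have to be inspected before or after encryption, which is the root of the privacy concern.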

An obligation to safeguard privacy is in Clause 19(3). The contradiction in the Bill is created because Ofcom is empowered to enforce a duty to safeguard privacy [Schedule 4, Section 10(3)] at the same time as it is asked to enforce content moderation [Clause 111]. It must enforce both of these using its quiver of powers. A provider complies with the privacy duty if it follows what Ofcom puts in the Code of Practice [Clause 45(1)].

However, the provision for protection of privacy has changed between the draft Bill of May 2021 and the Bill as introduced to the House of Commons on 17 March 2022. The draft Bill mandated providers to safeguard users from 'unwarranted infringements of privacy'. The Bill currently before the House reads as follows: 'the reference to protecting the privacy of users is to protecting users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a user-to-user service or a search service (including, but not limited to, any such provision or rule concerning the processing of personal data)'.

This would appear to be a weakening of privacy protection for users. It should be diligently scrutinised.

A platform with an encrypted messaging service could therefore be expected to protect users from unwarranted infringements of privacy, and at the same time be mandated to scan and monitor its users' communications. If the encrypted service was asked to monitor all of its users and the content they post, all of the time, for an indefinite period, and arbitrarily remove their posts, that would surely be a gross breach of that privacy obligation.

The Joint Committee for the draft Online Safety Bill, in its report of 14 December 2021 [4], does discuss encrypted services specifically, and assumes they are in scope.

If this is actually what the government wants to do, and they hope to be able to do so with impunity, then that would be a very worrying development. It is in line with a broader assault on rights that is being observed. A present danger is that the proposed Human Rights Act reform and data protection reforms would remove the safeguards in current UK law under the GDPR, and make it easier to get away with such breaches.

Privacy is also a safety mechanism. It protects against intrusion by bad actors and by the surveillance state. It is quite a chilling thought that a Bill intended to make people safer could result in their privacy being compromised.

---

Photo : Damian Hinds, Home Office Minister for Security and Borders, giving evidence to the Joint Committee for the Online Safety Bill, 4 November 2021, screenshot by me via Parliamentlive.tv

---

Iptegrity is made available free of charge under a Creative Commons licence. You may cite my work, with attribution. If you reference the material in this article, kindly cite the author as Dr Monica Horten, and link back to Iptegrity.com. You will also find my book for purchase via Amazon.

About me: I've been analysing digital policy for over 14 years. Way back then, I identified the way that issues around rights can influence Internet policy, and that has been a thread throughout all of my research. I hold a PhD in EU Communications Policy from the University of Westminster (2010), and a post-graduate diploma in marketing. For many years before I began my academic research, I was a telecoms journalist and an early adopter of the Internet, writing for the Financial Times and Daily Telegraph, among others.

Please get in touch if you'd like to know more about my current research.

If you liked this article, you may also like my book The Closing of the Net which discusses the backstory to the Online Safety Bill. It introduces the notion of structural power in the context of Internet communications. Available in Kindle and Paperback from only £15.99!

---

[1] Online Safety Bill as introduced to the House of Commons on 17 March 2022

[2] Draft Online Safety Bill (Joint Committee) Oral Evidence transcript 4 November 2021 Q287

[3] Alec Muffett, Why we need #EndToEndEncryption and why it's essential for our safety, our children's safety, and for everyone's future #noplacetohide

[4] Report of Session 2021-22, HL Paper 129, HC 609, published 14 December 2021, paragraphs 252-258.

