MEPs reach political agreement to protect children and privacy
- Author: Monica Horten
- Published: 01 November 2023
A political agreement in the European Parliament aims to resolve the heated discourse over end-to-end encryption and protecting children online. It's a pragmatic way forward that seeks both to safeguard against child abuse and to maintain the confidentiality of communications. It's not yet over the line, but from a human rights perspective, it's heading in the right direction.
This agreement was negotiated in the European Parliament's powerful LIBE (Civil Liberties) Committee. A leading proponent was the German MEP Patrick Breyer, who issued a press release with details of the agreement, which forms the basis of this post.
It amends the Child Sexual Abuse Regulation (CSAR), tabled by the European Commission in May 2022 and currently making its legislative journey through the European Parliament. The heart of the Commission's proposal is tackling child abuse material online by requiring platform providers, including encrypted services, to detect, identify and remove it. The Regulation would mean universal monitoring of private chat platforms via smartphones, potentially using client-side scanning. It has been called out as a form of surveillance.
If adopted, the LIBE Committee's political agreement would overturn the blanket surveillance measures sought by the Commission. End-to-end encrypted services would be explicitly protected. It would, however, allow targeted surveillance where there is reasonable suspicion that an individual or group is linked to child sexual abuse. The MEPs who drafted the agreement argue that their proposal is more resilient to legal challenges than the Commission's original draft.
The agreement is yet to be voted on, but it does have cross-party support, ranging from the conservative side of the Parliament to the progressives. Notably, it is supported by the EPP (the largest group), S&D, the Greens, Renew, and the Left group. As a long-term observer of European digital policy-making, I can say that this level of agreement is a very positive outcome from a human rights perspective.
Of course, that will not be the end of the story. Following adoption by the European Parliament, the agreement will go to the Council of Ministers, and there will be a negotiation between the institutions. These are the three-way talks known as trilogues (Parliament, Council and Commission). At this stage, the Commission would be an observer. Although it may have influence behind the scenes, the Parliament and the Council would battle it out. The risk for the Council is that it could lose the whole initiative.
The Council has twice postponed a vote on this Regulation, and from what I hear, a number of Member States do not agree with it or have reservations. They include Germany, Austria, and Poland. If it is adopted by the European Parliament before Christmas, as hoped, then trilogues would most likely begin after the European elections next May.
As I understand it, the political agreement is designed to improve the balance between the responsibility of the State and that of providers. Tech companies will be required to build in security by design. Mandatory age verification will not be required, nor will the restriction on under-16s using app stores (both are being deleted from the Regulation). Proactive searching of public online platform content for child sexual abuse material will be permitted; however, the requirement to do this is addressed to the EU Child Protection Centre (and not providers, as I understand it). Providers will be legally obliged to remove this material when made aware of it. Law enforcement agencies will be obligated to report material to platforms. However, providers will have to avoid collateral blocking of lawful content.
The Child Sexual Abuse Regulation has been led by the European Commission's Directorate for Migration and Home Affairs (DG Home), currently headed by the Swedish Commissioner Ylva Johansson. It appears likely that it was the result of industry capture. The Commission has recently come under pressure because of a close relationship between DG Home and certain well-funded lobby groups, and also for its alleged use of micro-targeted advertising techniques to promote the proposal.
It's not the first time that a reversal like this has happened in EU digital policy. In 2008, the European Commission was captured by entertainment industry lobbyists, and a proposal for copyright enforcement that would have violated freedom of expression was inserted into a new telecoms law. After a lot of pressure, the EU was forced to backtrack, and ever since then, fundamental rights have been central to digital policy-making. You can read all about it on this website – just go to the menu section "Telecoms Package".
This agreement does have serious consequences for the new British law, the Online Safety Act. I will address those in a further article.
---
You are free to cite from this article. Kindly acknowledge Dr Monica Horten as the author and provide a link back.
I provide independent advice on policy issues related to online content. I specialise in interpreting amendments to laws. It was a core element of my PhD methodology and I've been doing it ever since. If you need help with the Online Safety Act please get in touch.
Caveat: it's rare for me to write an article without having seen the actual text, however, I feel that the points raised by this proposal are important for the UK and so I am flagging it up. As soon as I can see the text, I'll do another post.