

Social media companies and content sharing apps could have to foot the bill for a vast automated copyright protection scheme under the most recent EU proposal to update copyright law. For those who remember, this is Hadopi on steroids. It's a proposal that, history tells us, is unlikely to be workable.

The battle over social media content sharing is moving up a gear as the European Parliament goes for a major vote on new copyright legislation this September. A single, controversial provision in the proposed EU Copyright Directive has brought the matter to a head in this latest round of the Hollywood vs Silicon Valley conflict. As currently drafted, it could mean that social media platforms and apps would have to restrict content via an automated copyright protection system - dubbed the "upload filter" - and they could be asked to fund the entire system.

This is Article 13, as proposed within the new EU Copyright Directive, which has just got through the Committee stage of the European Parliament process. The document is known as the Voss report after the rapporteur, Axel Voss. It is due to go to a First Reading vote by the entire Parliament in a plenary session on 12 September.

Article 13, as drafted in the Voss report, calls for social media companies to enforce copyright by preventing unlicensed or infringing content from being uploaded. It would make them responsible for managing the entire process, including systems implementation and complaints handling. The proposed law makes no provision for reimbursement of costs, and the platforms would also be asked to take on liability for action taken incorrectly against legal content.

Article 13 has been the subject of contention since the Directive was first made public in 2016. In its first iteration, it used the phrase 'prevent the availability' of copyrighted works. These words have been interpreted as reflecting the intention of the provision, which has been fleshed out in more detail in the Voss report. The overall idea is to target social media platforms, forcing them into copyright licensing agreements for music and entertainment content and compelling them to police their users' uploaded content.

The Voss report text does not use the term 'upload filter', nor does it use the word 'filter' at all, but it describes a system that leaves no other option for the platform, app or intermediary.

Article 13 of the Voss report refers to a new type of service provider - the online content sharing service provider - which would be defined in Article 2 of the directive. It then seeks to impose an obligation on these providers to operate a copyright licensing system, and it would mandate them to 'conclude fair and appropriate licensing agreements with rightholders'.

On the face of it, that doesn't sound so bad, but then comes the sting in the tail. The online content sharing service providers would be asked to monitor for any unlicensed content and restrict, block or remove it. If they don't have a licensing agreement, they will be obligated to ensure that all copyrighted works cannot be accessed via their systems.

The text of Article 13 actually states "appropriate and proportionate measures leading to the non-availability on those services of works or other subject matter infringing copyright or related-rights". 'Non-available' means that the file should not appear on the system. The only way to make a file 'non-available' is to filter all files as they are uploaded and prevent them from being put online. This means checking every file from every user against a database of digital fingerprints supplied by copyright holders. Therefore, the 'measures' to be taken entail a procedure involving general monitoring of every user's uploads.

This is quite different from the current system, which allows rights-holders to notify hosting providers of copyright-infringing content that has already been uploaded. It therefore amounts to a form of prior restraint. (See Macron-May Internet deal: necessary measures or prior restraint?)

Article 13 is deliberately written in a circular manner because the drafters of this legislation know that EU law precludes Member States from asking intermediaries to install a filtering system with the aim of preventing uploading of files. This was determined in the ruling of the European Court of Justice in Scarlet Extended. The ruling makes special reference to a situation where the intermediary is asked to do this for an indefinite period of time and at its own cost. It also issues a reminder that the rights of copyright holders must be balanced against the freedom to conduct business, and the individual rights to free speech and privacy. (See Sabam v Scarlet: Court rules that ISPs can't be asked to filter.)

This new category of 'online content sharing service providers' is highly problematic. The definition in the draft law says that 'one of the main purposes' of their business is 'to store and give access to the public to copyright protected works uploaded by its users, which the service optimises'.

The text includes a carve-out for cloud storage services and online marketplaces, which would potentially be put out of business if they were asked to snoop on their customers' files. It also carves out online encyclopaedias. It is difficult to see how a 'main purpose' of an online encyclopaedia could be to give access to copyrighted works.

The directive asks for the measures to be implemented 'in cooperation with rightholders'. Here is the second sting. As we know from past experience, asking Internet companies to work with rights-holders on this kind of copyright enforcement is highly problematic, and has been proven to be unworkable.

Social media and content sharing platforms would be responsible for full procedural implementation. They would develop the necessary systems, including the handling of the files of digital fingerprints supplied by rights-holders, reporting back to rights-holders, and putting in place a dispute resolution process for their users.

The reason it is unworkable is the complexity of the procedures that have to be put in place. The copyright holders are the only people who know what content is copyrighted, so they have to provide databases containing digital fingerprints of their repertoire. The intermediaries have to filter the content against those databases - which requires an element of systems integration - and on identifying an infringing file they have to take action. If they have a licensing agreement, then that would be flagged up. If not, the file would be pulled.
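To make the mechanics concrete, the matching step described above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not a description of any real system: production filters (such as YouTube's Content ID) use perceptual fingerprints that survive re-encoding, whereas this sketch stands in an exact hash, and all names and data here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual audio/video fingerprint.
    (Illustrative only: a real filter cannot use an exact hash,
    because any re-encoding of the file would defeat it.)"""
    return hashlib.sha256(data).hexdigest()

def check_upload(data: bytes, rights_db: dict, licensed_ids: set) -> str:
    """Return the action a platform would take on one upload.

    rights_db maps fingerprint -> work identifier, supplied by
    rights-holders; licensed_ids lists works the platform holds
    licensing agreements for.
    """
    work = rights_db.get(fingerprint(data))
    if work is None:
        return "allow"            # no match in any rights-holder database
    if work in licensed_ids:
        return "allow-licensed"   # matched, but covered by a licence
    return "block"                # unlicensed match: made 'non-available'

# Hypothetical example: one licensed work, one unlicensed work
song = b"some copyrighted audio"
film = b"some copyrighted video"
rights_db = {fingerprint(song): "song-001", fingerprint(film): "film-001"}
licensed_ids = {"song-001"}

print(check_upload(song, rights_db, licensed_ids))           # allow-licensed
print(check_upload(film, rights_db, licensed_ids))           # block
print(check_upload(b"home video", rights_db, licensed_ids))  # allow
```

Even this toy version shows where the integration burden falls: the platform must ingest and maintain every rights-holder's database, run the check on every upload, and then handle the consequences of each decision, including the wrong ones.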

Online content sharing providers would also be asked to establish dispute resolution systems, and to take on the liability for error. Rights-holders are being asked to provide the 'judgement' on whether or not the content is infringing. This was shown to run counter to EU law in 2009 during the debates over the Telecoms Package, and a clause was written into telecoms law to prevent this kind of non-judicial assessment. I have written extensively on this (see Telecoms Package sealed, but not with a kiss and my book The Copyright Enforcement Enigma).

The French attempted a system that became known as Hadopi, after the government office set up to oversee it. This system failed, and was ultimately shelved. (See Hadopi budget to be slashed as French review 3-strikes.) The "upload filter" is far more complex than Hadopi: it would involve all platforms, large and small, across 27 countries, with multiple licensing options, and the administrative difficulties should not be underestimated.

The Article 13 text (Voss report) calls for 'stakeholder dialogues' to be established by Member State governments in order to make the necessary inter-industry arrangements. Such dialogues have been tried and failed in the past. (See Commission slams the lid on EU Hadopi talks.)

And then there is the matter of costs. The proposed directive makes no mention of them. However, history tells us that resolving the question of costs is critical. It was costs that brought down the UK measures targeting broadband providers over peer-to-peer file-sharing of copyrighted content. (See DE Act Costs Order: has Ofcom lost its sense of timing?)

Hence, previous attempts to get Internet intermediaries and rights-holders to jointly operate a copyright enforcement system have proved unworkable.

But there is a crucial difference between the previous attempts and now. Back then, the intermediaries were the network providers, and they fought it out in the courts and in the lobbies. Now, we are talking about mega platforms that comprise some of the world's largest companies, which de facto hold sway over the content hosting market globally, alongside a raft of tiny app developers, start-ups and other innovators. It may suit the monopolistic large platforms to give in, because it serves to reinforce their market power and they can afford it. Smaller players will not be able to.


The legislative process: The Directive may be amended in the plenary vote. Amendments to delete Article 13 may well be tabled. This would seem to be a sensible move, as it would enable the rest of the Directive to continue its legislative progress, whilst pulling aside Article 13 for further consideration. It would open up a full and frank debate about how to manage copyright issues on social media platforms and content sharing apps, and it would enable the legal contradictions in the current proposal to be ironed out. But that will not be the end of the process. The Council of Ministers has to have a first reading. There is a second and potentially a third reading in the European Parliament.


Further analysis of EU Copyright Directive (Voss Report) Article 13:

Innocenzo Genna: Art.13 of the new Copyright Directive and the censorship machine, for dummies

Joe McNamee: ENDitorial: The Commission's new filtering adventure

EDRi: Deconstructing Article 13

Felipe Romero-Moreno: 'Notice and staydown' and social media: amending Article 13 of the Proposed Directive on Copyright


Contact me if you would like to discuss any of the issues raised (Via Contact Us page or Twitter @Iptegrity).

For new Iptegrity readers, I have been analysing EU policy for over 10 years (see 10 years of Internet wars), and have considerable experience of interpreting EU documents, which are sometimes quite opaque.

In the current political climate of Brexit, I would like to avoid any misunderstanding about my position on the EU. Over the past 10 years as an analyst of EU policy, I have come to value its openness to critical commentary and citizen access at multiple levels. The EU is imperfect, like any political institution, but relative to others, it has a system that enshrines democratic values and facilitates robust scrutiny of policy proposals. I believe in working with Europe, not against it.

If you cite this article or its contents, please attribute Iptegrity.com and Monica Horten as the author.


About Iptegrity

Iptegrity.com is the website of Dr Monica Horten. I am an independent policy advisor: online safety, technology and human rights. In April 2024, I was appointed as an independent expert on the Council of Europe Committee of Experts on online safety and empowerment of content creators and users. I am a published author, and post-doctoral scholar. I hold a PhD from the University of Westminster, and a DipM from the Chartered Institute of Marketing. I cover the UK and EU. I'm a former tech journalist, and an experienced panelist and Chair. My media credits include the BBC, iNews, Times, Guardian and Politico.

Iptegrity.com is made available free of charge for non-commercial use. Please link back and attribute Dr Monica Horten.  Contact me to use any of my content for commercial purposes.