Big tech accountability? Read the backstory to today's policy debates here on Iptegrity.

Online Safety Bill debate in the House of Commons - Westminster Parliament

Image: an empty House of Commons debating the Online Safety Bill on 12 September 2023.

TL;DR: On the same day as the UK Parliament approved the Online Safety Bill (soon to be the Online Safety Act), a US court blocked a law intended to protect children when they access the Internet, on the grounds that it violates the First Amendment of the US Constitution, which protects free speech.

When you run the Online Safety Bill through the mirror of this US court ruling, you get a remarkable set of findings. Whilst US free speech law operates differently from UK law, it is very likely that the Online Safety Bill will also be in breach of freedom of expression under the Human Rights Act, which enshrines the European Convention on Human Rights. This article takes a comparative look.

Read more...

The Online Safety Bill has had a difficult relationship with freedom of expression, as its main premise is to remove content. For that reason it was a pleasant surprise to see the House of Lords amend the Bill with explicit support for free speech as a right under the Human Rights Act and European Convention on Human Rights (ECHR).

Until now, this support has been missing from the Bill. This is therefore a positive outcome from the House of Lords, which will redress the balance between content removal and free speech.

Read more...

Tweet from UK Prime Minister Rishi Sunak on an agreement with social media platforms on small boats content

TL;DR: The government is to establish an £11 million Online Capability Centre to seek and identify small boats content from people smugglers. The centre will be run by the National Crime Agency in co-operation with social media platforms.

In order to protect freedom of expression, the government must precisely identify the specific content to be removed. The examples published by the government on Twitter / X give some clues. However, the centre is not a technological silver bullet that will solve the question of people arriving on UK shores in small boats.

And it is concerning for British democracy to have law enforcement working so closely alongside the companies that run our public conversational spaces, with the power to restrain publication and no independent oversight.


Read more...

I'm delighted that my paper 'Algorithms patrolling content: where's the harm? An empirical examination of Facebook shadow bans and their impact on users' has been published in the International Review of Law, Computers & Technology.

It has been a lot of hard work to get this to publication, but now that it's out, I hope it will inform academics, students and policy-makers about an obscure aspect of content moderation that has a very real impact on individuals who are active on social media. The ghosting of their Pages and accounts through shadow banning is not a soft enforcement option but a significant interference with their freedom of expression.

It is making its way into law and policy with hardly a blink of the eye, as policy-makers adopt the belief that regulating 'behaviour' is a good way to deal with harmful content. Yet, as I argue in the paper, suppressing the dissemination of content on the basis of the account's 'behaviour' can interfere with freedom of expression to almost the same extent as taking it down. It takes no account of whether the content is lawful. And if there is no requirement to notify the user, how can they even appeal such a restriction on their rights?

Read more...

TL;DR They say they do, but the Bill is not clear. The government has been quite shifty in its use of language to obscure a requirement for encrypted messaging services to monitor users' communications. If they do comply with this requirement, they will have to break the encryption that protects users' privacy, and users risk being less safe online. However, they will also be conflicted in their legal duties to protect users' privacy, as will the regulator Ofcom. Private messaging services are important to millions of UK users. Their obligation under the Online Safety Bill needs clarification and amendment.

***UPDATE 24 May 2022: Quietly, behind the scenes, there is confirmation that this is exactly what the government wants to do.***

Read more...

TL;DR A puzzling feature of the UK Online Safety Bill is the special protection it gives to 'content of democratic importance'. It asks the large online platforms to give special treatment to such content, in cases where they are taking a decision to remove the content or restrict the user who posted it. However, the term appears to have been coined by the government for the purpose of this Bill, and what it means is not clear. There is no statement of the policy issue that it is trying to address. That makes it very difficult for online platforms to code for this requirement.

Read more...

TL;DR The UK government's Online Safety Bill creates a double standard for freedom of expression that protects large media empires and leaves ordinary citizens exposed. It grants special treatment to the large news publishers and broadcasters, who get a carve-out from the measures in the Bill, so that headlines like the notorious 'Enemies of the People' get special protection from the automated content moderation systems. They even get a VIP lane to complain. Foreign disinformation channels would also benefit from this carve-out, including Russia Today. Content posted by ordinary British people could be arbitrarily taken down.

Read more...

TL;DR Social media companies will be required by the government to police users' posts by removing the content or suspending the account. Instead of a blue-uniformed policeman, it will be a cold, coded algorithm putting its virtual hand on the shoulder of the user. The imprecise wording offers them huge discretion. They have a conflicted role: to interfere with freedom of expression and simultaneously to protect it. Revision is needed to protect the rights of those who are speaking lawfully, and doing no harm, but whose speech is restricted in error.

Read more...

Draft Online Safety Bill committee, 4 November 2021

TL;DR Key decisions will be taken behind Whitehall facades, with no checks and balances. The entire framework of the Bill is loosely defined and propped up by Henry VIII clauses that allow the Secretary of State (DCMS and Home Office) to implement the law using Statutory Instruments. This means that Ministerial decisions will get little or no scrutiny by Parliament. This will include crucial decisions about content to be suppressed and compliance functions required of Internet services. Standards for automated detection of illegal content will be determined by the Home Secretary. The concern is whether these powers could ever be used to block lawful but inconvenient speech.

Read more...

TL;DR The government's Impact Assessment calculates that this Bill will cost British businesses over £2 billion to implement. By its own admission, 97 per cent of the 24,000 businesses in scope are at low risk of having illegal or harmful content on their systems. Only 700-800 are likely to be high risk, and the real target, the big global platforms, only number around half a dozen. It is hard to see how the draft Bill of May 2021 could be justified on this basis. The Bill should focus on the real aim of tackling the global mega-platforms, and the high-risk issues like child sexual abuse. For 97 per cent of the 24,000 small British businesses, there is no evidence that they entail any risk, and the cost and regulatory effort are disproportionate to the aims.

Read more...

TL;DR A website blocking order is a modern form of censorship. In the wrong hands, it is a dangerous weapon. Blocking orders provided for in Clauses 91-93 of the Online Safety Bill could be used in the most egregious cases to block overseas Internet services that refuse to comply with the Bill. They are not suitable for targeting 'big tech' social media platforms. Blocking orders have been used in the UK for copyright enforcement since 2011, and there is a body of caselaw to draw on. If these orders are used, they should be precise and specify the exact locations of the content, site or server to be blocked.

Read more...

Nadine Dorries, Secretary of State, 4 November 2021, screenshot from Parliamentlive.tv

TL;DR The Online Safety Bill is a major piece of legislation intended to tackle the very difficult and troubling issues around social media. However, in its desire to remove the bad stuff, the Bill is setting up a legal and technical framework that mandates and enforces the automated suppression of online content and social media posts. The lack of a precise aim has enabled it to be moulded in a way that raises a number of concerns. Government Ministers will have unprecedented powers to define the content to be removed. They will be able to evade Parliamentary scrutiny through the use of Secondary Legislation. Social media platforms will have a wide discretion to interpret the rules and to determine whether content stays up or goes down. These factors, combined with the overall lack of precision in the drafting and the weak safeguards for users, mean that the Bill is unlikely to meet human rights standards for protecting freedom of expression online.

UPDATED to reflect the Bill as Introduced to the House of Commons on 17 March 2022.

Read more...

UK intelligence services have been taking advantage of gaps in the international rules to conduct bulk interception of Internet traffic. That practice came under scrutiny in the European Court of Human Rights, in a ruling that was released this week.

The case of Big Brother Watch and Others v the United Kingdom was brought to the Court by human rights activist groups who were concerned about the mass online surveillance being carried out by UK intelligence services. It has resulted in a ruling that lays out essential ground rules for protecting privacy.

Read more...

As talks on a UK-EU post-Brexit trade deal enter their tense final stages, a vital agreement on security co-operation is hanging in the balance. A bespoke proposal has been tabled by the EU. It would facilitate ongoing access to the cross-border data that police and intelligence services need. If it cannot be agreed, there are serious risks for law enforcement and individual privacy. A reluctance on the part of the UK government to commit to future support for the European Convention on Human Rights puts the agreement in jeopardy.

The security co-operation agreement is needed so that UK law enforcement...

Read more...

The Closing of the Net (Polity Press 2016)


"takes the pulse of the open web" Journal of IP Law & Practice


PAPERBACK & KINDLE FROM £15.99

If the open Internet is an essential precondition for democracy, should governments or corporations be allowed to restrict it? This is the question at the heart of my book 'The Closing of the Net', which discusses the backdrop to today's political controversies around issues such as fake news, terrorism content online and misuse of data - controversies that result in calls for 'responsibility' by online companies. The book argues that any regulation of these companies must enshrine public interest criteria that balance the competing rights at stake.

Read more...

Image: Dr Monica Horten at the Moldova ICT Summit, April 2016.

Find me on LinkedIn

About Iptegrity

Iptegrity.com is the website of Dr Monica Horten.

I am a tech policy specialist, published author and post-doctoral scholar. I hold a PhD from the University of Westminster and a DipM from the Chartered Institute of Marketing. I am currently working on the UK Online Safety Bill.

Recent media quotes: BBC, iNews, Times, Guardian, Politico. Panelist: IAPP, Cybersecurity Summit, Parliament and Internet. June 2022-July 2023 with Open Rights Group.

Iptegrity.com is made available free of charge for non-commercial use. Please link back and attribute Dr Monica Horten. Contact me to use any of my content for commercial purposes.

The politics of copyright

A Copyright Masquerade - How corporate lobbying threatens online freedoms

'timely and provocative' Entertainment Law Review