National Crime Agency to run new small boats social media centre
- Author: Monica Horten
- Published: 08 August 2023
TL;DR: The government is to establish an £11 million Online Capability Centre to seek and identify small boats content from people smugglers. The centre will be run by the National Crime Agency in co-operation with social media platforms.
In order to protect freedom of expression, the government must precisely identify the specific content to be removed. The examples published by the government on Twitter / X give some clues. The centre is not a technological silver bullet for the problem of people arriving on UK shores in small boats.
And it is concerning for British democracy to have law enforcement working so closely alongside the companies who run our public conversational spaces, with the power to restrain publication, and no independent oversight.
The Prime Minister's announcement this week of an agreement with leading social media platforms on content promoting small boat Channel crossings illustrates how the Online Safety Bill can be leveraged by the government to push its own political agenda.
The government has done a deal with Meta, TikTok and X/Twitter to remove social media promotions by the people smugglers who organise Channel crossings for asylum seekers. Under the deal, the platforms will "voluntarily" remove these promotions, co-operating with the National Crime Agency (NCA), which will notify them of what to take down. The NCA will be bolstered with new staff and resources to carry out this task.
Behind the sabre-rattling, populist rhetoric about reining in big tech and clamping down on illegal immigration, the UK government is setting up an £11 million Online Capability Centre to identify social media posts from people smugglers. The aim is to move towards AI-based detection of this content by law enforcement. This is being done in preparation for the Online Safety Bill, anticipated to come into force this autumn, when platforms will be required to "prevent" people coming into contact with these posts.
The National Crime Agency will be hiring technical specialists to help build an understanding of the scale of people smugglers' content online, and to develop its own intelligence capabilities in this area. It will seek out and identify the content before notifying social media platforms to remove it.
This suggests the agreement with Facebook, TikTok and Twitter has been done under the current law: online platforms are required to act on reports of illegal content that notify them of the precise content to be removed. It follows a similar template to voluntary agreements done with social media companies in the past, for example on copyright.
But that will change with the Online Safety Bill, which defines "assisting illegal immigration" and "unlawful entry to the UK" as illegal content to be removed by social media platforms. They can be required to proactively seek out this content on their own initiative, and preventatively remove it.
This preventative action could be taken as users are uploading content, so that it never reaches the platform, creating new legal dilemmas as outlined by Dan Squires KC in this legal opinion. It up-ends the existing legal regime: online platforms will be expected to determine illegality for themselves and remove content before it is published. Users who have acted lawfully will have little opportunity to appeal.
The Prime Minister's office has published on social media some examples of the content it expects platforms to remove. These show the type of images and messages the government wants taken down, which gives some clues to platform moderators. However, more precision is needed.
It is one of my ongoing criticisms of the Online Safety Bill that the illegal content to be removed is so poorly defined that platforms will not know what to take down, and will inevitably interfere with freedom of expression by removing lawful posts. In order to protect freedom of expression, social media companies need a clear definition of what the government regards as illegal; otherwise there will be serious question marks over the decision-making process.
We must also question how much power this hands to government over an important area of public debate. From a free speech perspective, the prospect of law enforcement working so closely with the companies who run our public conversational spaces, with no independent oversight, is deeply worrying for our democracy.
And we should not be mistaken. Seen in the wider context, the loss of border controls has in part happened because of the loss of access to the Schengen Information System (SIS II), which I've commented on previously. This is a direct consequence of decisions taken in negotiating the treaty with the EU, where the government chose not to have a security co-operation agreement. Taking down social media content is not a technological silver bullet to solve the issue of people arriving from across the Channel in small boats.
---
You may cite from this article with attribution to Dr Monica Horten, Iptegrity.com and provide a link back.
About Iptegrity
Iptegrity.com is the website of Dr Monica Horten, independent policy advisor on online safety, technology and human rights, advocating to protect the rights of the majority of law-abiding citizens online. She is an independent expert on the Council of Europe Committee of Experts on online safety and empowerment of content creators and users; a published author and post-doctoral scholar, with a PhD from the University of Westminster and a DipM from the Chartered Institute of Marketing; and a former telecoms journalist, experienced panelist and Chair, cited in the media, e.g. BBC, iNews, Times, Guardian and Politico.