Shadow bans: EU and UK diverge on user redress
- Author: Monica Horten
- Published: 18 October 2024
Social media users in the EU can claim redress under the Digital Services Act (DSA) if they are shadow-banned - the practice of making posts invisible without removing them. This follows a recent case in the Dutch courts which established that the DSA applies to shadow banning when the technique is used as a content moderation action. UK users, who are subject to the Online Safety Act, may not be so lucky, because that Act lacks the procedural safeguards that are built into the DSA.
Shadow banning practices include the delisting of websites in search engines and the demotion of posts by the recommender systems that push content into users’ timelines and news feeds. A shadow ban occurs when a post is actively demoted in the feeds, or is given such a low ranking by a search engine that it almost never shows up. The user who posted the content experiences a ghosting effect: they can still see the content online via their dashboard, but it gets no traffic and no ‘likes’ or other engagement.
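To illustrate the mechanism, here is a minimal Python sketch of how a ranking pipeline could quietly demote a flagged post without removing it. Everything in it - the Post class, the visible_score function, the DEMOTION_FACTOR value - is invented for this example and does not describe any real platform's implementation.

```python
# Hypothetical illustration of a "shadow ban" in a ranking pipeline.
# All names here are invented for this sketch; no real platform's code
# is being described.

from dataclasses import dataclass

DEMOTION_FACTOR = 0.01  # a flagged post keeps 1% of its score, so it rarely surfaces

@dataclass
class Post:
    author: str
    text: str
    engagement_score: float  # baseline relevance/engagement signal
    flagged: bool = False    # set by an automated moderation classifier

def visible_score(post: Post) -> float:
    """Score used to order feeds and search results. A flagged post is not
    removed: its score is silently multiplied down so it almost never ranks."""
    score = post.engagement_score
    if post.flagged:
        score *= DEMOTION_FACTOR
    return score

def rank_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    # The post still exists and its author can still view it directly,
    # but it effectively vanishes from everyone else's ranked results.
    return sorted(posts, key=visible_score, reverse=True)[:limit]

posts = [
    Post("alice", "cat video", 40.0),
    Post("bob", "policy critique", 95.0, flagged=True),  # shadow-banned
    Post("carol", "recipe", 55.0),
]
for p in rank_feed(posts, limit=2):
    print(p.author, p.text)
# bob's high-engagement post never makes the cut, with no notice to him
```

The key point the sketch captures is that nothing is deleted: the author's own view of the post is unchanged, which is exactly why the ghosting effect is so hard for users to detect or prove.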
These practices are used by platforms to avoid take-down decisions, but they leave users in the dark, not knowing what has happened and with no possibility of seeking redress. Until now, users have also been in a legal no-man’s land, where the law did not address their problem.
The Dutch case concerned an instance of search engine delisting on the platform 'X', formerly known as Twitter, in autumn 2023. The claimant, Danny Mekic, a Dutch privacy activist and PhD candidate at Leiden University, argued that his account on 'X' had been delisted in the platform's search engine because of a post criticising the EU’s Child Sexual Abuse Regulation (CSAR). He was able to establish this via followers of his account, who could no longer find it in search results.
The case is discussed in English in this article by Paddy Leerssen, a postdoctoral researcher at the University of Amsterdam.
From Leerssen’s article, we understand that the court considered the merits of the case both on grounds of contractual liability and under the new content moderation rules of the Digital Services Act, the EU’s powerful new law governing social media platforms.
Contractual liability concerns the social media platform’s terms and conditions. The issue turns on “discoverability” and whether it is an important element of the service for users. In other words, whether it matters to users that they can be found by others using a search engine function. According to Leerssen’s account, the court determined that “discoverability” is indeed an important function.
The question regarding the Digital Services Act turned on whether there is a requirement to tell users why their content has been shadow-banned. DSA Article 17 requires social media platforms such as 'X' to provide a statement of reasons when they impose a content moderation restriction.
The DSA definition of content moderation includes actions that “affect the availability, visibility or accessibility” of content. According to Leerssen’s account, the court had “little difficulty” in concluding that this language describes shadow bans. It follows that shadow bans do fall within the scope of the DSA’s content moderation rules, and so Article 17 applies.
Hence, social media platforms must provide a statement of reasons when they impose shadow bans. They must also offer a means of redress to users whose content has been subjected to shadow bans. This really matters because until now, users had no means of challenging shadow bans, which can seriously harm their ability to engage on social media.
The effect of a shadow ban is not very different from completely taking down or removing the content. As I argue in my paper Algorithms patrolling content: where’s the harm?, it is an interference with freedom of expression rights. I wrote the paper before the DSA had been adopted, on the basis of an earlier text. However, I was able to establish the connection between the language of ‘restricting visibility’ in the DSA and its application to shadow bans. I was also able to assess that social media platforms applying shadow bans would have to provide a statement of reasons, and that users subjected to shadow bans should have access to an effective internal complaints procedure as well as judicial redress.
Unlike the majority of affected users, Mr Mekic did actually get a response from 'X'. It stated that the platform "has automated mechanisms to analyse posts which may be associated with Child Sexual Exploitation (CSE), and subsequently may restrict those posts’ reach. This can cause individual posts associated with an account to be surfaced with temporary account-level restriction". He also got confirmation that his posts had triggered the restriction on the visibility of his account, which was subsequently reversed. He was lucky.
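X’s reply describes a cascade in which an automated post-level flag triggers a temporary account-level restriction. The sketch below is a purely hypothetical illustration of that cascade: the names (Account, classify_post, moderate) are invented, and a real system would presumably use machine-learning classifiers rather than the naive keyword match used here.

```python
# Hypothetical sketch of a post-level flag escalating into an account-level
# search restriction. This is illustrative only; it is not X's actual code.

from dataclasses import dataclass, field

@dataclass
class Account:
    handle: str
    search_delisted: bool = False                       # account-level restriction
    flagged_posts: list[str] = field(default_factory=list)

def classify_post(text: str) -> bool:
    """Stand-in for an automated classifier that flags posts it associates
    with CSE material. Here: a toy keyword heuristic, purely illustrative."""
    return "csar" in text.lower()

def moderate(account: Account, post_text: str) -> None:
    if classify_post(post_text):
        account.flagged_posts.append(post_text)
        # A single flagged post triggers a temporary account-level
        # restriction: the whole account stops appearing in search results.
        account.search_delisted = True

acct = Account("@researcher")
moderate(acct, "Why the CSAR proposal threatens encryption")
print(acct.search_delisted)  # True: the account is now invisible in search
```

The sketch makes the scale of the side effect visible: a classifier error on one post silently delists an entire account, which is consistent with what Mr Mekic experienced.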
Unfortunately, users in the UK are not so lucky. With the UK out of the Single Market, the DSA does not apply. The UK’s Online Safety Act allows for a possible complaint to the social media platform. It states that users may complain about their content "being given a lower priority or otherwise becoming less likely to be encountered by other users", and about content moderation technology being used in a way that breaches the provider’s terms of service. This could potentially cover shadow bans. However, without the procedural and judicial safeguards of the DSA, users who want to mount a challenge may have a tough time.
---
I provide independent advice on policy issues related to online content. Please get in touch via the Contact page.
If you cite this article, kindly acknowledge Dr Monica Horten as the author and provide a link back.
If you like this article, you may like to read my paper, in which I examined shadow bans and showed how they interfere with freedom of expression. If you cannot get access and would like to read it, I have a small number of free copies that I can give away.