Platforms’ algorithms and recommender systems are dangerous for democracy

Here is what we could do about it

Published: 5 April 2022

“The EU Digital Services Act can be a powerful tool in protecting social media users by default and empowering them to exercise real control over their data and the information they see.” – Dorota Głowacka, Panoptykon, Poland

For citizens to participate fully in democratic processes, they need to be able to make informed decisions. In recent decades, the internet has made information more accessible than ever. Paradoxically, however, we seem to be getting less and less information of public interest while being shut into online echo chambers that reinforce our views and prevent us from holding healthy and fruitful dialogues. Platforms that use algorithms and recommender systems bear a fair share of the responsibility for this situation.

In February 2021, Panoptykon, a Civitates grantee partner, started a project focused on better understanding social media optimisation algorithms – the systems large platforms use to deliver behavioural advertising and recommend content to users – in order to develop evidence-based policy recommendations addressing the harms these systems cause. They partnered with a data scientist from Northeastern University, Boston, MA, and conducted an experiment that demonstrated how Facebook uses algorithms to deliver personalised ads that may exploit users’ mental vulnerabilities. It showed that users are unable to get rid of disturbing content: disabling sensitive interests in ad settings limits targeting options for advertisers but does not affect Facebook’s own profiling and ad delivery practices.
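The distinction at the heart of this finding – advertiser-side targeting versus platform-side delivery optimisation – can be illustrated with a deliberately simplified sketch. Everything below is hypothetical (the names, the two-stage split, the scoring); it is not Facebook’s actual code, and only shows why switching off a visible ‘interest’ can change who may target a user without changing what the delivery model ranks first.

```python
# Conceptual sketch, NOT Facebook's actual code: all class names, fields
# and scores are hypothetical. It only illustrates the two-stage split
# the study points at: advertiser targeting vs. platform delivery.

from dataclasses import dataclass


@dataclass
class User:
    visible_interests: set     # shown in ad settings; the user can disable these
    inferred_affinity: dict    # behavioural profile; the user cannot edit this


@dataclass
class Ad:
    name: str
    targeted_interests: set    # chosen by the advertiser
    topic: str                 # what the delivery model scores on


def stage1_targeting(user: User, ads: list) -> list:
    """Advertiser-side filter: disabling a visible interest shrinks the set
    of campaigns that may explicitly target the user."""
    return [ad for ad in ads
            if not ad.targeted_interests                       # broadly targeted campaign
            or ad.targeted_interests & user.visible_interests]


def stage2_optimisation(user: User, ads: list) -> list:
    """Platform-side delivery: ranks the eligible ads by predicted engagement
    computed from the inferred profile, which ad settings never touch."""
    return sorted(ads,
                  key=lambda ad: user.inferred_affinity.get(ad.topic, 0.0),
                  reverse=True)


# A user who has disabled every health 'interest' in her ad settings...
user = User(visible_interests={"travel"},
            inferred_affinity={"health": 0.9, "travel": 0.2})

ads = [
    Ad("clinic", targeted_interests=set(), topic="health"),    # broad campaign
    Ad("flights", targeted_interests={"travel"}, topic="travel"),
]

# ...can no longer be explicitly targeted on health (stage 1), yet the
# delivery model (stage 2) still ranks the health ad first.
print([ad.name for ad in stage2_optimisation(user, stage1_targeting(user, ads))])
# -> ['clinic', 'flights']
```

In this toy model, the user’s control only reaches stage 1; stage 2 keeps optimising on a profile the user never sees, which is the mechanism the experiment set out to demonstrate.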

Dorota Głowacka, a lawyer at Panoptykon, explains:

Large online platforms have become key channels through which people access information and experience the world. But the content they see is filtered through the lens of algorithms driven by commercial logic that maximises engagement to generate even more data about the user for the purposes of surveillance advertising. This automated fixation on campaign targets is indifferent to ‘collateral damage’: amplification of hate or disinformation, or – as this case study shows – reinforcement of trauma and anxiety.

“Our study was based on the case of a woman who approached us after receiving disturbing ads about serious illnesses and other health-sensitive subjects. She had health concerns herself, but suddenly noticed that a very large portion of the ads she saw on Facebook related to health and illness, which fuelled her anxiety. We discovered that Facebook had linked over 20 illness-related ‘interests’ to her profile. She had never indicated these interests herself; Facebook had inferred them from her online activities, both on and off the platform. In cooperation with a specialist in the analysis of Facebook user activity, we monitored her account, which after some time confirmed that she was receiving a disproportionate number of health-related ads.

“We then decided to find out what would happen if we deleted these interests from her account. We noticed that even though we removed them, her experience did not change significantly: she received almost as many illness-related ads as before. This means that the control Facebook offers its users is useless. The ad optimisation system will still deliver the ads it believes are right for the person – ‘right’ meaning likely to provoke the user’s engagement, not necessarily to give them a positive experience.”
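The monitoring step Dorota describes comes down to comparing the share of illness-related ads observed before and after the interests were removed. Below is a minimal sketch of that comparison, with a hypothetical keyword classifier and made-up ad logs rather than the study’s actual data:

```python
# Illustrative sketch of the before/after comparison; the keyword list
# and the ad logs are made up, not the study's data or classifier.

HEALTH_KEYWORDS = {"cancer", "therapy", "diabetes", "clinic", "anxiety"}


def health_share(observed_ads):
    """Fraction of observed ads whose text matches a health keyword."""
    hits = sum(1 for ad in observed_ads
               if any(kw in ad.lower() for kw in HEALTH_KEYWORDS))
    return hits / len(observed_ads)


# Hypothetical ad logs collected before and after disabling the
# illness-related 'interests' in the account's ad settings.
before = ["New cancer screening clinic", "Cheap flights",
          "Anxiety therapy app", "Diabetes diet plan", "Running shoes"]
after = ["Private clinic offer", "Cheap flights",
         "Therapy sessions online", "Running shoes", "Diabetes monitor"]

print(f"before: {health_share(before):.0%}, after: {health_share(after):.0%}")
# -> before: 60%, after: 60%  (if the toggle were effective, the 'after'
#    share should have dropped sharply; in the study it barely changed)
```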

Panoptykon’s case study shows that social media users are helpless against platforms that exploit their vulnerabilities for profit, but it is not too late to fix this. In fact, as the watchdog organisation puts it, “The EU Digital Services Act can be a powerful tool in protecting social media users by default and empowering them to exercise real control over their data and the information they see.”

The project received prominent coverage in national and international media outlets such as the Financial Times and served as a basis for advocacy. While the Digital Services Act was before the Internal Market and Consumer Protection (IMCO) Committee of the European Parliament, Panoptykon coordinated an open letter to the committee, signed by over twenty civil society organisations, demanding that it “ensure effective oversight of the algorithms used by large online platforms that shape our experience of the world”. Several Civitates grantee partners signed the letter, which drew attention from Brussels-based media such as Politico and Euractiv.

Dorota shares that, for Panoptykon, the ideal remedy for the negative consequences of these algorithms would be a default ban on the use of inferred personal data for advertising purposes. “For various reasons this is not achievable at the moment, so we have come up with a compromise: together with 49 civil society organisations from all over Europe, we called on Members of the European Parliament to adopt amendments to the Digital Services Act (DSA) proposal that would empower users, ensure effective oversight of algorithms and limit surveillance-based ads. This would at least protect users from platforms’ most intrusive advertising practices, which often rely on exploiting users’ sensitive attributes and vulnerabilities and lead to manipulation.

“While the current debate on the draft DSA proposal presented by the European Commission largely focuses on issues related to user content moderation, we think that – though important – these measures are less inconvenient for platforms because they do not challenge their surveillance-based business model or affect their attention-maximising algorithms. The DSA can and should restrict online platforms’ ability to use algorithmic predictions for advertising and content recommendations. To counter manipulative design that nudges users into consenting to these harmful practices, and to avoid ineffective control tools, users should be able to use interfaces independent of the platform, including alternative content personalisation systems built on top of the existing platform by commercial or non-commercial actors whose services better align with users’ interests.”

Some of the recommendations made by Panoptykon were included as amendments to the DSA, and the study also contributed to the European Commission’s proposal to distinguish between targeting and optimisation as part of the political advertising legislation presented in November 2021.

Panoptykon’s case study also received positive reactions from other organisations working on technology. It improved the general understanding of how optimisation algorithms function and what role they play among civil society advocates, who can draw on it in their own advocacy positions or use it as a starting point for new research.

The Panoptykon Foundation was established in April 2009 on the initiative of a group of engaged lawyers to voice their opposition to surveillance.