Better transparency tools to put pressure on platforms

“While it is clear that algorithmically driven intermediaries shape public discourse, we do not know the exact details. That is why transparency is one of the most important regulatory goals,” says Mackenzie Nelson, project manager of ‘Governing Platforms’ at AlgorithmWatch.

In order to tackle the influence of intermediaries such as Facebook, Google, Twitter and YouTube on public discourse, several regulatory efforts are under way, both at EU level and in individual EU countries. With the project ‘Governing Platforms’, funded by Civitates, AlgorithmWatch and its partners are developing innovative governance proposals to feed into policy processes at EU and national level.

Mackenzie Nelson is project manager at AlgorithmWatch and coordinates ‘Governing Platforms’.

How big is the influence of intermediaries on the public discourse in Europe?
A communications study that we commissioned made it clear that one of the main challenges in conducting research in this field is that the data researchers need to gather evidence isn’t available. Our second study, which, like the first, comes from our partners at the Mainz Media Institute, builds on some of these findings. It argues that transparency is one of the most important regulatory goals. It is clear that algorithmically driven intermediaries shape public discourse, but how, to what extent and under which conditions is still very difficult to assess. This hinders the work of civil society organisations tasked with monitoring problems such as hate speech and disinformation. They need better transparency tools in order to put pressure on platforms (or governments) to do a better job of tackling those challenges. Both studies will be launched and made available to the public on 26 May.

To what extent have you been in contact with the intermediaries themselves?
Our second stakeholder meeting is coming up, at which Facebook will present its project Social Science One. This is a partnership between Facebook and the research community, launched to grant researchers access to data so that they can examine Facebook’s impact on elections and democracy. On the one hand, Facebook deserves credit here, because there has been no comparable partnership between YouTube and researchers, for instance. It is a good example of Facebook stepping up, but the partnership has had some serious shortcomings. It was heavily delayed, and researchers were extremely frustrated because these delays had a great impact on their work. There were also broader concerns within the research community about the platform being able to set the research agenda and decide which research questions could be asked. Apart from Facebook, we hope to have representatives from Twitter and Google at our meeting as well, as we think it is important to engage in a constructive dialogue with them. When it comes to transparency and data access, the platforms have an interest in common European rules, so that they do not have to comply with each individual country’s own initiatives. This is also recognised by the European Commission, and it is perhaps one of the reasons why the Commission is trying to create common rules through the Digital Services Act.

Are there any innovative proposals that AlgorithmWatch and partners will be presenting?
One of the reasons platforms give for not enabling better access to data is the GDPR, but there are lessons we can learn from other sectors about how to combine GDPR compliance with ethical frameworks that give researchers access to the data they need in order to understand potential threats. Together with our partners at the Institute for Information Law (IViR) at the University of Amsterdam, we will be presenting two case studies of best practices in data access frameworks. One comes from the environmental sector, where national authorities in EU countries gather emissions data from big polluters and report it to the authorities at EU level, who then make the data accessible to researchers. The second case focuses on the health sector, which obviously deals with very sensitive data. But examples like Finland’s Findata offer insights into how policymakers can design governance structures that allow researchers to work with sensitive data in an ethical way, through secure operating environments. We think there is a lot to be learned from other sectors, and at our meeting we will discuss what might be applicable to platform governance.

What would be your ultimate dream with regard to the intermediaries?
I think that civil society and journalists need to be able to monitor the digital media ecosystem in the same way that we need to monitor the physical ecosystem for pollutants. I really like this metaphor of thinking about the information ecosystem as an environment. We will never be able to get rid of ‘pollutants’ entirely, and we may not even agree on what qualifies as a ‘pollutant’ in public discourse, because free speech is important. But if we had a better overview of the different harms, and could monitor threats and make sure platforms are held accountable, then we would also be better able to safeguard an ecosystem that is, at its core, the bedrock of democracy.

AlgorithmWatch is a non-profit research and advocacy organisation committed to evaluating and shedding light on algorithmic decision-making processes that have social relevance, meaning they are used either to predict or prescribe human action or to make decisions automatically.