
Jamily Maiara Menegatti Oliveira (Masters in European Union Law from the School of Law of University of Minho)
On 18 November 2025, the European Board for Digital Services, in cooperation with the European Commission, published its first annual report under Article 35(2) of the Digital Services Act (DSA). The report identifies the most prominent and recurring systemic risks associated with Very Large Online Platforms (VLOPs), together with the corresponding mitigation measures.[1] It holds institutional significance, inaugurating a new reporting cycle under the DSA. More importantly, it illustrates the European Union's initial steps in incorporating the structural impacts of digital platforms on the exercise of fundamental rights into a risk governance framework.
Although Article 34(1)(b) of the DSA expressly includes media freedom and pluralism among the fundamental rights potentially affected by systemic risks, the report does not treat the media as a distinct category of analysis. The reference to media freedom and pluralism is subsumed within the broader context of freedom of expression and information, as well as considerations regarding access to a plurality of opinions, including those originating from media organisations. This methodological approach suggests a functional perspective on media freedom and pluralism, centred on the implications of content dissemination and moderation systems for civic discourse, and it raises a legal question: do indirect safeguards suffice to uphold the democratic integrity of the digital public sphere?
Continue reading “Algorithmic moderation, shadow banning and systemic risks for media in the European Union: reflections from the first report under Article 35(2) of the Digital Services Act”