On rebalancing powers in the digital ecosystem in recent CJEU case law (or on the battle between David and Goliath)

Alessandra Silveira (Editor of this official blog, Academic Coordinator of the Jean Monnet Centre of Excellence “Digital Citizenship & Technological Sustainability” - CitDig, Erasmus+)

There is no doubt that European Union (EU) law is committed to a certain rebalancing of powers in the digital ecosystem. And why is that? Because there is today a clear imbalance of power in favour of digital service providers, which calls for a strengthening of the position of users in their relationship with those providers. The Internet has become a space made up of platforms, on which unilaterally established and non-transparent business models are developed. This attempt to rebalance power in the digital ecosystem is an exercise in social justice that only the EU can foster. And this trend is particularly noticeable in the field of personal data protection.

The emergence of a business model based on data – and on profiling based on inferred data – reveals the imbalance of power between users and platforms. This has led some authors to recognise the quasi-public powers exercised by technology companies on the Internet: they regulate, enforce and resolve conflicts of interest, acting in an uncontrolled manner that we would not tolerate even from public authorities under the rule of law. But the problem must be contextualised: what is personal data?


Editorial of March 2024

By Alessandra Silveira

On inferred personal data and the difficulties of EU law in dealing with this matter

The right not to be subject to automated decisions was considered for the first time before the Court of Justice of the European Union (CJEU) in the recent SCHUFA judgment. Article 22 GDPR (on individual decisions based solely on automated processing, including profiling) has always raised many doubts among legal scholars:[1] i) what counts as a decision taken “solely” on the basis of automated processing?; ii) does this Article provide for a right or, rather, lay down a general prohibition whose application does not require the data subject to actively invoke a right?; iii) to what extent does such an automated decision produce legal effects concerning the data subject or similarly significantly affect him or her?; iv) do the provisions of Article 22 GDPR apply only where there is no relevant human intervention in the decision-making process?; v) if a human being examines and weighs other factors when making the final decision, is that decision no longer made “solely” on the basis of the automated processing [and, in that situation, does the prohibition in Article 22(1) GDPR not apply]?

To these doubts a German court has added a few more. SCHUFA is a private company under German law which provides its contractual partners with information on the creditworthiness of third parties, in particular consumers. To that end, it uses mathematical and statistical procedures to establish a prognosis of the probability of a person’s future behaviour (‘score’), such as the repayment of a loan, based on certain characteristics of that person. The establishment of scores (‘scoring’) is based on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behaviour can be predicted.[2]
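To make the logic of such scoring more concrete, the following is a minimal, purely illustrative sketch (in Python) of a group-based scoring procedure of the kind described above: a new applicant is assigned to the group of past applicants with the most comparable characteristics, and the score is simply the repayment rate observed within that group. All attributes and figures are hypothetical assumptions chosen for illustration; they do not reflect SCHUFA’s actual (undisclosed) model.

```python
# Purely illustrative sketch: a toy "scoring" procedure in which a new applicant
# is assigned to the group of past applicants with the most comparable
# characteristics, and the score is the repayment rate observed in that group.
# All attributes and figures are hypothetical; they do not reflect SCHUFA's
# actual (undisclosed) model.

from dataclasses import dataclass
from math import dist


def features(age: float, income: float, open_credits: float) -> tuple:
    """Crudely normalise attributes so that no single one dominates the distance."""
    return (age / 100, income / 10_000, open_credits / 10)


@dataclass
class PastApplicant:
    age: float           # years
    income: float        # monthly net income, in EUR
    open_credits: float  # number of credit lines currently open
    repaid: bool         # observed behaviour: did this person repay?


def score(applicant: tuple, history: list, k: int = 3) -> float:
    """Share of repayers among the k past applicants most similar to the new one."""
    ranked = sorted(
        history,
        key=lambda p: dist(applicant, features(p.age, p.income, p.open_credits)),
    )
    group = ranked[:k]
    return sum(p.repaid for p in group) / len(group)


if __name__ == "__main__":
    # Hypothetical historical data (the "group of other persons").
    history = [
        PastApplicant(25, 1_800, 3, repaid=False),
        PastApplicant(31, 2_400, 1, repaid=True),
        PastApplicant(45, 3_900, 0, repaid=True),
        PastApplicant(52, 2_100, 4, repaid=False),
        PastApplicant(38, 3_200, 2, repaid=True),
        PastApplicant(29, 2_600, 1, repaid=True),
        PastApplicant(61, 1_500, 5, repaid=False),
    ]
    new_applicant = features(age=34, income=2_500, open_credits=2)
    print(f"Estimated probability of repayment: {score(new_applicant, history):.2f}")
```

The point the sketch makes visible is that the prediction is derived from the observed behaviour of other, comparable persons rather than from anything the data subject has done – which is precisely why such inferred data raise the concerns discussed above.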
