On rebalancing powers in the digital ecosystem in recent CJEU case law (or on the battle between David and Goliath)

Alessandra Silveira (Editor of this official blog, Academic Coordinator of the Jean Monnet Centre of Excellence “Digital Citizenship & Technological Sustainability” - CitDig, Erasmus+)
           

There is no doubt that European Union (EU) law is committed to a certain rebalancing of powers in the digital ecosystem. And why is that? Because there is today a clear imbalance of power in favour of digital service providers, which calls for a strengthening of the position of users in their relationship with those providers. The Internet has become a space made up of platforms, in which business models are established unilaterally and without transparency. This attempt to rebalance power in the digital ecosystem is an exercise in social justice that only the EU can foster. And this trend is particularly noticeable in the field of personal data protection.

The emergence of a business model based on data – and on profiling based on inferred data – reveals the imbalance of power between users and platforms. This has led some authors to recognise the quasi-public powers exercised by technology companies on the Internet: they regulate, enforce and resolve conflicts of interest, acting in an uncontrolled way that, under the rule of law, we would not permit even to public authorities. But the problem must be contextualised: what is personal data?


Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling)

Alessandra Silveira (Editor)
           

1) What is new about this process? Article 22 GDPR is finally being considered before the European Court of Justice (ECJ) – and on 16 March 2023, the Advocate General’s Opinion in Case C-634/21 [SCHUFA Holding and Others (Scoring)][1] was published. Article 22 GDPR (apparently) lays down a general prohibition of individual decisions based “solely” on automated processing – including profiling – but its provisions raise many doubts in legal doctrine.[2] Furthermore, Article 22 GDPR is limited to automated decisions that i) produce effects in the legal sphere of the data subject or that ii) significantly affect him/her in a similar manner. The content of the latter provision is not entirely clear, but as suggested by the Article 29 Data Protection Working Party (WP29), a “similar effect” can be interpreted as one that significantly affects the circumstances, behaviour or choices of data subjects – for example, decisions affecting a person’s financial situation, including their eligibility for credit.[3] To this extent, the effectiveness of Article 22 GDPR may be very limited until EU case law clarifies i) what a decision taken solely on the basis of automated processing is, and ii) to what extent such a decision produces legal effects or significantly affects the data subject in a similar manner.

2) Why is this case law so relevant? Profiling is a form of automated processing often used to make predictions about individuals – and it may, or may not, lead to automated decisions within the meaning of Article 22(1) GDPR. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a particular category or group and to draw inferences or predictions from that classification – about their ability to perform a task, their interests or presumed behaviour, and so on. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of concepts, characteristics or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others. The ECJ has the opportunity to assess the existence of legal remedies to challenge operations which result in automated inferences that are not reasonably justified. As set out below, the approach adopted by the Advocate General has weaknesses – and if the ECJ adopts the conditions suggested by the Advocate General, many reasonable interpretative doubts about Article 22 GDPR will persist.

3) What questions does Article 22 GDPR raise? Does this Article provide for a right or, rather, a general prohibition whose application does not require the data subject to actively invoke a right? What is a decision based “solely” on automated processing – a notion which apparently excludes “largely” or “partially”, but not “exclusively”, automated decisions? Will the provisions of Article 22 GDPR apply only where there is no relevant human intervention in the decision-making process? If a human being examines and weighs other factors when making the final decision, will that decision not be made “solely” on the basis of the automated processing (and, in that situation, will the prohibition in Article 22(1) GDPR not apply)?
