Alessandra Silveira (Editor)
1) What is new about this process? Article 22 GDPR is finally coming before the European Court of Justice (ECJ) – and on 16 March 2023, the Advocate General’s Opinion in Case C-634/21 [SCHUFA Holding and Others (Scoring)][1] was published. Article 22 GDPR (apparently) lays down a general prohibition of individual decisions based “solely” on automated processing – including profiling – but its provisions raise many doubts in legal doctrine.[2] Furthermore, Article 22 GDPR is limited to automated decisions that i) produce effects in the legal sphere of the data subject or ii) significantly affect him/her in a similar manner. The content of the latter provision is not entirely clear, but, as suggested by the Article 29 Data Protection Working Party (WP29), a “similar effect” can be interpreted as one significantly affecting the circumstances, behaviour or choices of data subjects – for example, decisions affecting a person’s financial situation, including their eligibility for credit.[3] To this extent, the effectiveness of Article 22 GDPR may remain very limited until EU case law clarifies i) what a decision taken solely on the basis of automated processing is, and ii) to what extent such a decision produces legal effects or significantly affects the data subject in a similar manner.
2) Why is this case law so relevant? Profiling is a form of automated processing often used to make predictions about individuals – and it may, or may not, lead to automated decisions within the meaning of Article 22(1) GDPR. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a particular category or group and to draw inferences or predictions from that placement – whether about their ability to perform a task, their interests, their presumed behaviour, and so on. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of concepts, characteristics or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others. The ECJ now has the opportunity to assess whether legal remedies exist to challenge operations which result in automated inferences that are not reasonably justified. As set out below, the approach adopted by the Advocate General has weaknesses – and if the ECJ adopts the conditions suggested by the Advocate General, many reasonable interpretative doubts about Article 22 GDPR will persist.
3) What questions does Article 22 GDPR raise? Does this Article provide for a right or, rather, a general prohibition whose application does not require the data subject to actively invoke a right? What is a decision based “solely” on automated processing (which apparently excludes “largely” or “partially”, but not “exclusively”, automated decisions)? Will the provisions of Article 22 GDPR only apply where there is no relevant human intervention in the decision-making process? If a human being examines and weighs other factors when making the final decision, is that decision no longer made “solely” on the basis of the automated processing [and, in that situation, does the prohibition in Article 22(1) GDPR not apply]?